Oct 03 09:43:36 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 09:43:37 crc restorecon[4719]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 09:43:37 crc restorecon[4719]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 
09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 
09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc 
restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:37 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc 
restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 09:43:38 crc restorecon[4719]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 09:43:38 crc kubenswrapper[4990]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.622640 4990 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629216 4990 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629250 4990 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629263 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629273 4990 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629283 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629293 4990 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629302 4990 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629310 4990 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629320 4990 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629328 4990 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629336 4990 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629345 4990 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629354 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629361 4990 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629370 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629378 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629387 4990 
feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629395 4990 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629404 4990 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629412 4990 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629423 4990 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629434 4990 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629442 4990 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629453 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629461 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629471 4990 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629480 4990 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629501 4990 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629536 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629546 4990 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629554 4990 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629563 4990 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629571 4990 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629579 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629587 4990 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629595 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629604 4990 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629612 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629621 4990 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629629 4990 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629637 4990 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629645 4990 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629653 4990 feature_gate.go:330] 
unrecognized feature gate: InsightsConfigAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629661 4990 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629670 4990 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629677 4990 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629685 4990 feature_gate.go:330] unrecognized feature gate: Example Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629694 4990 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629701 4990 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629709 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629718 4990 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629726 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629739 4990 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629749 4990 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629758 4990 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629767 4990 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629776 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629784 4990 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629793 4990 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629801 4990 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629809 4990 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629817 4990 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629824 4990 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629833 4990 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629840 4990 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629848 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629856 4990 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629864 4990 feature_gate.go:330] unrecognized feature 
gate: ExternalOIDC Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629872 4990 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629880 4990 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.629888 4990 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630030 4990 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630047 4990 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630063 4990 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630075 4990 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630086 4990 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630095 4990 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630107 4990 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630120 4990 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630129 4990 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630138 4990 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630148 4990 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630158 4990 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.630167 4990 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630177 4990 flags.go:64] FLAG: --cgroup-root="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630188 4990 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630197 4990 flags.go:64] FLAG: --client-ca-file="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630206 4990 flags.go:64] FLAG: --cloud-config="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630216 4990 flags.go:64] FLAG: --cloud-provider="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630226 4990 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630236 4990 flags.go:64] FLAG: --cluster-domain="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630245 4990 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630254 4990 flags.go:64] FLAG: --config-dir="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630263 4990 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630273 4990 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630284 4990 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630294 4990 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630303 4990 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630312 4990 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630321 4990 flags.go:64] FLAG: --contention-profiling="false" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.630330 4990 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630340 4990 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630350 4990 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630359 4990 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630378 4990 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630388 4990 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630396 4990 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630406 4990 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630415 4990 flags.go:64] FLAG: --enable-server="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630423 4990 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630435 4990 flags.go:64] FLAG: --event-burst="100" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630445 4990 flags.go:64] FLAG: --event-qps="50" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630454 4990 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630463 4990 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630473 4990 flags.go:64] FLAG: --eviction-hard="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630484 4990 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630493 4990 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.630504 4990 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630541 4990 flags.go:64] FLAG: --eviction-soft="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630550 4990 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630559 4990 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630568 4990 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630577 4990 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630586 4990 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630595 4990 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630604 4990 flags.go:64] FLAG: --feature-gates="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630615 4990 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630624 4990 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630633 4990 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630643 4990 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630652 4990 flags.go:64] FLAG: --healthz-port="10248" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630662 4990 flags.go:64] FLAG: --help="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630670 4990 flags.go:64] FLAG: --hostname-override="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630679 4990 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.630689 4990 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630698 4990 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630707 4990 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630716 4990 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630726 4990 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630734 4990 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630743 4990 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630752 4990 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630761 4990 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630771 4990 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630780 4990 flags.go:64] FLAG: --kube-reserved="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630789 4990 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630798 4990 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630849 4990 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630860 4990 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630869 4990 flags.go:64] FLAG: --lock-file="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630880 4990 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 
09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630889 4990 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630900 4990 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630915 4990 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630924 4990 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630933 4990 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630942 4990 flags.go:64] FLAG: --logging-format="text" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630952 4990 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630961 4990 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630970 4990 flags.go:64] FLAG: --manifest-url="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630979 4990 flags.go:64] FLAG: --manifest-url-header="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.630990 4990 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631000 4990 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631011 4990 flags.go:64] FLAG: --max-pods="110" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631020 4990 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631029 4990 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631038 4990 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631047 4990 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631057 4990 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631065 4990 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631075 4990 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631095 4990 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631104 4990 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631115 4990 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631126 4990 flags.go:64] FLAG: --pod-cidr="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631135 4990 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631148 4990 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631157 4990 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631167 4990 flags.go:64] FLAG: --pods-per-core="0" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631176 4990 flags.go:64] FLAG: --port="10250" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631185 4990 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631194 4990 flags.go:64] FLAG: --provider-id="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631203 4990 flags.go:64] FLAG: --qos-reserved="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631212 4990 
flags.go:64] FLAG: --read-only-port="10255" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631222 4990 flags.go:64] FLAG: --register-node="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631231 4990 flags.go:64] FLAG: --register-schedulable="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631243 4990 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631258 4990 flags.go:64] FLAG: --registry-burst="10" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631267 4990 flags.go:64] FLAG: --registry-qps="5" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631276 4990 flags.go:64] FLAG: --reserved-cpus="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631285 4990 flags.go:64] FLAG: --reserved-memory="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631296 4990 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631305 4990 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631314 4990 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631323 4990 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631332 4990 flags.go:64] FLAG: --runonce="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631341 4990 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631350 4990 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631360 4990 flags.go:64] FLAG: --seccomp-default="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631369 4990 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631378 4990 
flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631387 4990 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631397 4990 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631406 4990 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631415 4990 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631425 4990 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631434 4990 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631443 4990 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631453 4990 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631462 4990 flags.go:64] FLAG: --system-cgroups="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631471 4990 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631484 4990 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631493 4990 flags.go:64] FLAG: --tls-cert-file="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631502 4990 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631544 4990 flags.go:64] FLAG: --tls-min-version="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631554 4990 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631562 4990 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.631572 4990 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631581 4990 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631590 4990 flags.go:64] FLAG: --v="2" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631602 4990 flags.go:64] FLAG: --version="false" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631613 4990 flags.go:64] FLAG: --vmodule="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631626 4990 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.631636 4990 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631840 4990 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631851 4990 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631860 4990 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631869 4990 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631877 4990 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631885 4990 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631893 4990 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631901 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631909 4990 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstall Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631917 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631925 4990 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631933 4990 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631941 4990 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631948 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631956 4990 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631964 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631972 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631980 4990 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631987 4990 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.631995 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632004 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632017 4990 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632025 4990 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 09:43:38 crc 
kubenswrapper[4990]: W1003 09:43:38.632033 4990 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632040 4990 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632048 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632056 4990 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632064 4990 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632072 4990 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632079 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632088 4990 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632096 4990 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632104 4990 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632112 4990 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632121 4990 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632129 4990 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632139 4990 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632149 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632158 4990 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632166 4990 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632176 4990 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632184 4990 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632192 4990 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632221 4990 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632229 4990 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632236 4990 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632244 4990 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632252 4990 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632261 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632268 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632276 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632284 4990 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632292 4990 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632308 4990 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632318 4990 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632326 4990 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632334 4990 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632341 4990 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632349 4990 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632357 4990 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632365 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632372 4990 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632383 4990 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632391 4990 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632399 4990 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632407 4990 feature_gate.go:330] unrecognized 
feature gate: VSphereStaticIPs Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632441 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632452 4990 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632462 4990 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632474 4990 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.632485 4990 feature_gate.go:330] unrecognized feature gate: Example Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.633471 4990 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.647044 4990 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.647103 4990 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647243 4990 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647262 4990 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647272 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647282 4990 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647291 4990 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647300 4990 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647310 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647319 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647328 4990 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647338 4990 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647347 4990 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647356 4990 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647366 4990 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647375 4990 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647384 4990 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647393 4990 feature_gate.go:330] 
unrecognized feature gate: MixedCPUsAllocation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647402 4990 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647414 4990 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647428 4990 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647437 4990 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647447 4990 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647456 4990 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647465 4990 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647473 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647482 4990 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647490 4990 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647502 4990 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647552 4990 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647564 4990 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647573 4990 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647583 4990 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647594 4990 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647603 4990 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647613 4990 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647625 4990 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647635 4990 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647643 4990 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647656 4990 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647667 4990 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647676 4990 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647684 4990 feature_gate.go:330] unrecognized feature gate: Example Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647693 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647702 4990 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647710 4990 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647719 4990 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647729 4990 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647737 4990 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647747 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647757 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647767 4990 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647776 4990 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647784 4990 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647793 4990 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647801 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647810 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647818 4990 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647827 4990 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647835 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647843 4990 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647852 4990 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647861 4990 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647869 4990 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647877 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647885 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647894 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647903 4990 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647911 4990 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647919 4990 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647928 4990 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647936 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.647946 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.647961 4990 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648232 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648249 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648259 4990 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648268 4990 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648276 4990 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648285 4990 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 09:43:38 crc 
kubenswrapper[4990]: W1003 09:43:38.648293 4990 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648302 4990 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648311 4990 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648322 4990 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648333 4990 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648346 4990 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648357 4990 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648366 4990 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648375 4990 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648384 4990 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648392 4990 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648400 4990 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648410 4990 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648419 4990 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 
09:43:38.648427 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648435 4990 feature_gate.go:330] unrecognized feature gate: Example Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648444 4990 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648453 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648461 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648471 4990 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648482 4990 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648491 4990 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648500 4990 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648532 4990 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648541 4990 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648550 4990 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648559 4990 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648567 4990 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 09:43:38 crc kubenswrapper[4990]: 
W1003 09:43:38.648577 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648586 4990 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648594 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648602 4990 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648611 4990 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648620 4990 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648629 4990 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648638 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648649 4990 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648658 4990 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648667 4990 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648675 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648684 4990 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648692 4990 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648700 4990 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648709 4990 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648717 4990 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648726 4990 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648734 4990 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648743 4990 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648752 4990 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648760 4990 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648769 4990 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648778 4990 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648786 4990 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648795 4990 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648803 4990 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648811 4990 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648820 4990 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 
09:43:38.648828 4990 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648839 4990 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648851 4990 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648860 4990 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648869 4990 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648879 4990 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648890 4990 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.648902 4990 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.648915 4990 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.649888 4990 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.655768 4990 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.655939 4990 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.658276 4990 server.go:997] "Starting client certificate rotation" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.658319 4990 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.658576 4990 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-03 12:30:18.881985679 +0000 UTC Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.658758 4990 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2210h46m40.223233339s for next certificate rotation Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.683993 4990 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.687892 4990 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.711618 4990 log.go:25] "Validated CRI v1 runtime API" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.765795 4990 log.go:25] "Validated CRI v1 image API" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.768875 4990 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.774935 4990 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-09-38-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.774989 4990 fs.go:134] Filesystem 
partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.790363 4990 manager.go:217] Machine: {Timestamp:2025-10-03 09:43:38.788371122 +0000 UTC m=+0.585002999 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1dbe54b5-0a5d-46a2-9c08-21093914202d BootID:8954c5f5-a70f-4fb3-9378-33cf06a3d6b1 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:1a:3d:89 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1a:3d:89 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2d:ca:9f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d4:c1:2f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2f:2b:6f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:85:56:0c Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b7:b0:e9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:32:c4:69:be:ba:61 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:07:54:f0:1d:dc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.790647 4990 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.790901 4990 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.791554 4990 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.791818 4990 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.791876 4990 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.792177 4990 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.792190 4990 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.792752 4990 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.792789 4990 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.793100 4990 state_mem.go:36] "Initialized new in-memory state store" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.793225 4990 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.796305 4990 kubelet.go:418] "Attempting to sync node with API server" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.796342 4990 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.796390 4990 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.796411 4990 kubelet.go:324] "Adding apiserver pod source" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.796427 4990 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.801106 4990 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.801903 4990 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.806272 4990 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.809029 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.809002 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.810142 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.810003 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Oct 03 09:43:38 
crc kubenswrapper[4990]: I1003 09:43:38.810870 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810903 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810913 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810923 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810937 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810946 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810956 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810971 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810982 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.810995 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.811009 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.811019 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.813541 4990 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.814135 4990 server.go:1280] "Started kubelet" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.815103 4990 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.815322 4990 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.815961 4990 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 09:43:38 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.816738 4990 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.817956 4990 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818002 4990 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818093 4990 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818077 4990 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 17:49:02.652259706 +0000 UTC Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818129 4990 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 920h5m23.83413325s for next certificate rotation Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818134 4990 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.818118 4990 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.818199 4990 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.818737 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.818812 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819345 4990 factory.go:55] Registering systemd factory Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819374 4990 factory.go:221] Registration of the systemd container factory successfully Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819752 4990 factory.go:153] Registering CRI-O factory Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819794 4990 factory.go:221] Registration of the crio container factory successfully Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819865 4990 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819920 4990 factory.go:103] Registering Raw factory Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.819944 4990 manager.go:1196] Started watching for new ooms in manager Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.820499 4990 server.go:460] "Adding debug handlers to kubelet server" Oct 03 
09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.820727 4990 manager.go:319] Starting recovery of all containers Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.822206 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="200ms" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.823720 4990 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186af1f04c2d951a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 09:43:38.814100762 +0000 UTC m=+0.610732619,LastTimestamp:2025-10-03 09:43:38.814100762 +0000 UTC m=+0.610732619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827244 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827316 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.827352 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827568 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827678 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827710 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827745 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827767 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827800 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827817 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827837 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827864 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827921 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827958 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.827977 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828003 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828026 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828046 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828067 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828086 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828108 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828127 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828147 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828175 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828197 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828219 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828243 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828270 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828290 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828313 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828330 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828357 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828375 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828390 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828409 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828426 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828447 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828463 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828480 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828521 
4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828544 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828569 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828586 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828602 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828624 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828640 4990 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828662 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828680 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828698 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828720 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828738 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828755 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828834 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828869 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828894 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.828915 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829120 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829139 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829162 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829179 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829198 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829217 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829234 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829253 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" 
seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829267 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829280 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829300 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829314 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829335 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829349 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829364 4990 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829381 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829395 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829415 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829430 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829444 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829467 4990 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829481 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829495 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829533 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829549 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829568 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829581 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829594 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829612 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829632 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829650 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829663 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829678 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829695 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829709 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829725 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829741 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829755 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829775 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" 
seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829791 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829817 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829834 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829848 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829870 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829885 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829906 4990 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829925 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829940 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829972 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.829993 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830008 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830030 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830055 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830079 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830094 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830115 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830137 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830159 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830172 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830193 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830210 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830232 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830247 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830260 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830281 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830295 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830311 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830329 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830347 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830398 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830415 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830429 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830447 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830463 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830483 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830497 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830529 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830551 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830564 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830582 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830597 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830611 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.830629 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830642 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830661 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830674 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830686 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830705 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830732 4990 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830743 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830760 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830772 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830791 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830805 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830821 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830842 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830859 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830881 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830895 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830907 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830928 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830940 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830959 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830972 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.830986 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831006 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831020 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831043 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831055 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831069 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831089 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831107 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831125 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831140 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831154 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831173 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831197 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831213 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831230 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831243 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831262 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831274 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831289 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831306 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831325 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" 
seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831493 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831525 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831550 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831564 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831578 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831596 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.831610 
4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832733 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832760 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832799 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832814 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832833 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832849 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832865 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.832887 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838142 4990 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838236 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838261 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838279 4990 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838293 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838306 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838320 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838333 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838345 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838358 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838375 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838389 4990 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838425 4990 reconstruct.go:97] "Volume reconstruction finished" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.838434 4990 reconciler.go:26] "Reconciler: start to sync state" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.845982 4990 manager.go:324] Recovery completed Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.857947 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.861035 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.861094 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.861109 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.862627 4990 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.862649 4990 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.862705 4990 state_mem.go:36] "Initialized new in-memory state store" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.868395 4990 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.870456 4990 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.870501 4990 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.870540 4990 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.870583 4990 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 09:43:38 crc kubenswrapper[4990]: W1003 09:43:38.871231 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.871281 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.887538 4990 policy_none.go:49] "None policy: Start" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.888394 4990 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 
09:43:38.888420 4990 state_mem.go:35] "Initializing new in-memory state store" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.918276 4990 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.937697 4990 manager.go:334] "Starting Device Plugin manager" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.937885 4990 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.937899 4990 server.go:79] "Starting device plugin registration server" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.938422 4990 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.938439 4990 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.938849 4990 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.938928 4990 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.938935 4990 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 09:43:38 crc kubenswrapper[4990]: E1003 09:43:38.948364 4990 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.971600 4990 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 
03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.972082 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.973304 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.973407 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.973474 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.973675 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.973955 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974057 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974576 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974613 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974622 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974766 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.974921 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975008 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975554 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975665 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975710 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975812 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975832 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975866 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975890 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.975900 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.976095 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc 
kubenswrapper[4990]: I1003 09:43:38.976735 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.976765 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.976791 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.976826 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.976836 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977160 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977260 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977298 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977567 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977655 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977723 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977900 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977927 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.977943 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.978828 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.978850 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.978859 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.978989 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.979012 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.979712 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.979742 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:38 crc kubenswrapper[4990]: I1003 09:43:38.979751 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.023091 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="400ms" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040233 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040416 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040471 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040553 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040591 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040687 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040748 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040789 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 
09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040837 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040896 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040942 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.040977 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.041005 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.041050 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.041086 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.041118 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.045565 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.045639 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.045661 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.045732 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.047092 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc" 
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144578 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144755 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144776 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144880 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144905 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144956 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.144849 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145037 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145002 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145096 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145143 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145326 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145379 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145431 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145389 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145499 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145585 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145625 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145670 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145718 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145765 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145770 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145869 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145930 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145966 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145841 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.145809 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.146773 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.247467 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.248779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.248815 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.248827 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.248854 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.249317 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.315064 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.323444 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.349645 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.367842 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-657c48d6a59a7a031e249e4d5aa13c65a6ea75010b9b1bca2a2f3c14c00070c3 WatchSource:0}: Error finding container 657c48d6a59a7a031e249e4d5aa13c65a6ea75010b9b1bca2a2f3c14c00070c3: Status 404 returned error can't find the container with id 657c48d6a59a7a031e249e4d5aa13c65a6ea75010b9b1bca2a2f3c14c00070c3
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.367968 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.373360 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-986831a02d5fd93a393d669c582f3028b73098b10d5896b297a333192c496d0e WatchSource:0}: Error finding container 986831a02d5fd93a393d669c582f3028b73098b10d5896b297a333192c496d0e: Status 404 returned error can't find the container with id 986831a02d5fd93a393d669c582f3028b73098b10d5896b297a333192c496d0e
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.375299 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.398451 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0e4c1789e8e1dd7e40421d2f7df2dad78dc99b2523ad64754e1806c5cd1cb24a WatchSource:0}: Error finding container 0e4c1789e8e1dd7e40421d2f7df2dad78dc99b2523ad64754e1806c5cd1cb24a: Status 404 returned error can't find the container with id 0e4c1789e8e1dd7e40421d2f7df2dad78dc99b2523ad64754e1806c5cd1cb24a
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.401052 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-c5cd98a025542349041fac885a30fa4fa67b21aaed37daabab3f41e54b711efb WatchSource:0}: Error finding container c5cd98a025542349041fac885a30fa4fa67b21aaed37daabab3f41e54b711efb: Status 404 returned error can't find the container with id c5cd98a025542349041fac885a30fa4fa67b21aaed37daabab3f41e54b711efb
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.424171 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="800ms"
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.461255 4990 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.146:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186af1f04c2d951a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 09:43:38.814100762 +0000 UTC m=+0.610732619,LastTimestamp:2025-10-03 09:43:38.814100762 +0000 UTC m=+0.610732619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.649420 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.651711 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.651772 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.651785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.651819 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.652340 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.705389 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.705507 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.816007 4990 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.860664 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.861013 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.875366 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"657c48d6a59a7a031e249e4d5aa13c65a6ea75010b9b1bca2a2f3c14c00070c3"}
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.876386 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"986831a02d5fd93a393d669c582f3028b73098b10d5896b297a333192c496d0e"}
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.877672 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5cd98a025542349041fac885a30fa4fa67b21aaed37daabab3f41e54b711efb"}
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.878614 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e4c1789e8e1dd7e40421d2f7df2dad78dc99b2523ad64754e1806c5cd1cb24a"}
Oct 03 09:43:39 crc kubenswrapper[4990]: I1003 09:43:39.879368 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"32038b2e05d0e18615915ef18bc7943fb0926219607993edeee56fd9b0fcf19f"}
Oct 03 09:43:39 crc kubenswrapper[4990]: W1003 09:43:39.960064 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:39 crc kubenswrapper[4990]: E1003 09:43:39.960194 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:40 crc kubenswrapper[4990]: E1003 09:43:40.225402 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="1.6s"
Oct 03 09:43:40 crc kubenswrapper[4990]: W1003 09:43:40.243238 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:40 crc kubenswrapper[4990]: E1003 09:43:40.243349 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.453433 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.455413 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.455465 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.455477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.455524 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 09:43:40 crc kubenswrapper[4990]: E1003 09:43:40.456062 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.816400 4990 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.884251 4990 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="20aa11ec3b577943d4ac7e4ee6f9957b624ca3f8dcdd9a5a0e97eaceaea77b69" exitCode=0
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.884344 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"20aa11ec3b577943d4ac7e4ee6f9957b624ca3f8dcdd9a5a0e97eaceaea77b69"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.884538 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.886397 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.886464 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.886489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.888633 4990 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91" exitCode=0
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.888765 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.888790 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.890246 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.890280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.890311 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.891157 4990 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5ad7b13c1a3cf6448c85dd04260be9988470ed49fe01662098de37328a437cb2" exitCode=0
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.891219 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.891253 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5ad7b13c1a3cf6448c85dd04260be9988470ed49fe01662098de37328a437cb2"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.892489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.892532 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.892542 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.896852 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.896896 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.896901 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.897033 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.897060 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.898094 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.898123 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.898136 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.901658 4990 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4" exitCode=0
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.901708 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4"}
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.901827 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.902690 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.902738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.902751 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.907294 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.908637 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.908698 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:40 crc kubenswrapper[4990]: I1003 09:43:40.908726 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:41 crc kubenswrapper[4990]: W1003 09:43:41.385198 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:41 crc kubenswrapper[4990]: E1003 09:43:41.385282 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.816330 4990 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:41 crc kubenswrapper[4990]: E1003 09:43:41.825933 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.146:6443: connect: connection refused" interval="3.2s"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.907038 4990 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ec70fb45dbbd36c05738897bba2dd2095435352a11d460d314f2e7d98d2cb760" exitCode=0
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.907155 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ec70fb45dbbd36c05738897bba2dd2095435352a11d460d314f2e7d98d2cb760"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.907206 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.908269 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.908309 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.908323 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.910446 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.910501 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.910533 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.910657 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.911927 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.911967 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.911981 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.913825 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"75c2a47a8d54b075957c28219d4ecb914571a69c2661168668d2942fc86d325e"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.913892 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.916903 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.916939 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.916952 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.920707 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.920756 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.920768 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.920779 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484"}
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.920798 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.921859 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.921887 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:41 crc kubenswrapper[4990]: I1003 09:43:41.921898 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.056890 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.058366 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.058406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.058417 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.058442 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 03 09:43:42 crc kubenswrapper[4990]: E1003 09:43:42.059009 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.146:6443: connect: connection refused" node="crc"
Oct 03 09:43:42 crc kubenswrapper[4990]: W1003 09:43:42.162833 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:42 crc kubenswrapper[4990]: E1003 09:43:42.163025 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError"
Oct 03 09:43:42 crc kubenswrapper[4990]: W1003 09:43:42.429456 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.146:6443: connect: connection refused
Oct 03 09:43:42 crc
kubenswrapper[4990]: E1003 09:43:42.429598 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.146:6443: connect: connection refused" logger="UnhandledError" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.926461 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518"} Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.927173 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.928155 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.928196 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.928210 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.931123 4990 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="308eff44f885285df534c2284fbb648cf153e19b278a0118aa1919a0023c2ccf" exitCode=0 Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.931204 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"308eff44f885285df534c2284fbb648cf153e19b278a0118aa1919a0023c2ccf"} Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 
09:43:42.931223 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.931276 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.931293 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.931343 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933420 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933391 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933451 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933429 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:42 crc kubenswrapper[4990]: I1003 09:43:42.933740 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:42 crc 
kubenswrapper[4990]: I1003 09:43:42.933749 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.939897 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0f6559c9c34bd974dcf444b5fa990fa08a907be27c02127f62eb596eb9681ced"} Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.940008 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3ed147b645b327b65285c97150dca319273b47280573f20224ca93f077e29edd"} Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.940030 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0f79d143cea8c5d50d3a6e5e0d43f33bc6d323d0923446038b0d2c0d633b8357"} Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.940044 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6cd6eaf2ae0043ac4cb8af70b90e32d780f33b3562c4befaa4761353f1d6dadd"} Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.939955 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.940118 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.941231 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.941285 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 
09:43:43 crc kubenswrapper[4990]: I1003 09:43:43.941296 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.153461 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.949721 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1bc9e902c28c7f47e19b2a7fffff7071e4ab72a62554ac5eb14a2b983e820726"} Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.949737 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.950299 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.949898 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.951939 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.951982 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.951993 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.952412 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:44 crc kubenswrapper[4990]: I1003 09:43:44.952460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:44 crc 
kubenswrapper[4990]: I1003 09:43:44.952475 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.259419 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.261227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.261280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.261299 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.261401 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.483283 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.483480 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.485095 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.485147 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.485159 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.951725 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.952983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.953014 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:45 crc kubenswrapper[4990]: I1003 09:43:45.953023 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.694925 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.695182 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.696766 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.696815 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.696832 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.702152 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.958578 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.960399 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:46 crc 
kubenswrapper[4990]: I1003 09:43:46.960439 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:46 crc kubenswrapper[4990]: I1003 09:43:46.960453 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.372386 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.372663 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.372729 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.374670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.374721 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.374737 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.496278 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.496463 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.497843 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.497886 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 09:43:47 crc kubenswrapper[4990]: I1003 09:43:47.497901 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.514554 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.514803 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.516291 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.516329 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.516342 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.861270 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.861493 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.863049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.863082 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.863093 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.907702 4990 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.907890 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.909325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.909430 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:48 crc kubenswrapper[4990]: I1003 09:43:48.909448 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:48 crc kubenswrapper[4990]: E1003 09:43:48.948492 4990 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.070585 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.070868 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.072254 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.072290 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.072302 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.075039 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.967409 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.968395 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.968442 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:49 crc kubenswrapper[4990]: I1003 09:43:49.968456 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:50 crc kubenswrapper[4990]: I1003 09:43:50.687541 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:50 crc kubenswrapper[4990]: I1003 09:43:50.970435 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:50 crc kubenswrapper[4990]: I1003 09:43:50.971766 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:50 crc kubenswrapper[4990]: I1003 09:43:50.971810 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:50 crc kubenswrapper[4990]: I1003 09:43:50.971820 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.070798 4990 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.070907 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.816288 4990 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 03 09:43:52 crc kubenswrapper[4990]: W1003 09:43:52.856061 4990 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.856152 4990 trace.go:236] Trace[481082220]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 09:43:42.854) (total time: 10001ms): Oct 03 09:43:52 crc kubenswrapper[4990]: Trace[481082220]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:43:52.856) Oct 03 09:43:52 crc kubenswrapper[4990]: Trace[481082220]: [10.001261547s] [10.001261547s] END Oct 03 09:43:52 crc kubenswrapper[4990]: E1003 09:43:52.856178 4990 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.976765 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.983349 4990 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518" exitCode=255 Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.983418 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518"} Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.983708 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.984662 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.984720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.984734 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:52 crc kubenswrapper[4990]: I1003 09:43:52.985485 4990 scope.go:117] "RemoveContainer" containerID="9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.176116 4990 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.176194 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.182825 4990 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.182893 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.988501 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.990878 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758"} Oct 03 09:43:53 crc 
kubenswrapper[4990]: I1003 09:43:53.991130 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.992178 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.992232 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:53 crc kubenswrapper[4990]: I1003 09:43:53.992249 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.376785 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.376951 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.377022 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.378088 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.378119 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.378131 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:57 crc kubenswrapper[4990]: I1003 09:43:57.381926 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.000459 4990 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.001238 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.001262 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.001270 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:58 crc kubenswrapper[4990]: E1003 09:43:58.171787 4990 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.174552 4990 trace.go:236] Trace[305224727]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 09:43:47.150) (total time: 11023ms): Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[305224727]: ---"Objects listed" error: 11023ms (09:43:58.174) Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[305224727]: [11.023704056s] [11.023704056s] END Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.174591 4990 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.174721 4990 trace.go:236] Trace[2081111113]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 09:43:46.304) (total time: 11870ms): Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[2081111113]: ---"Objects listed" error: 11870ms (09:43:58.174) Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[2081111113]: [11.870555291s] [11.870555291s] END Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.174740 4990 reflector.go:368] 
Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 09:43:58 crc kubenswrapper[4990]: E1003 09:43:58.175496 4990 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.177838 4990 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.179667 4990 trace.go:236] Trace[1176454449]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 09:43:47.711) (total time: 10467ms): Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[1176454449]: ---"Objects listed" error: 10467ms (09:43:58.179) Oct 03 09:43:58 crc kubenswrapper[4990]: Trace[1176454449]: [10.467861314s] [10.467861314s] END Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.179715 4990 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.889097 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.889305 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.890357 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.890379 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.890387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:58 crc kubenswrapper[4990]: I1003 09:43:58.902644 4990 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 09:43:58 crc kubenswrapper[4990]: E1003 09:43:58.948635 4990 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.002483 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.002692 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004115 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004163 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004172 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.004256 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.068006 4990 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.074683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.078176 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.809496 4990 apiserver.go:52] "Watching apiserver" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.811785 4990 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812095 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-lrqf5"] Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812565 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812607 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812618 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812653 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812679 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.812671 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.812857 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.812981 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.813096 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.813209 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.814804 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.815731 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.815780 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.815920 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816044 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816419 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816499 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816553 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816684 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.816735 4990 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.817057 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.818375 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.819167 4990 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.829671 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.839219 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.849161 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.859110 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.872473 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3
c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.881703 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890001 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890063 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890090 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890117 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890144 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890211 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890233 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890255 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890280 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890301 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890326 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890346 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890366 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890388 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.890409 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893677 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893787 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893856 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893906 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893963 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894000 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894038 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894070 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894098 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894187 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894217 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894260 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894302 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894339 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894375 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894434 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894474 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894548 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894597 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894635 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894673 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894715 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894799 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894829 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894862 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894900 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894925 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894955 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894983 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895010 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895037 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895065 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895095 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895124 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895157 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895183 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895209 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895238 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895268 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895293 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895321 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895348 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895379 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895402 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895432 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895463 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895489 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895540 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895744 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895781 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895822 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895857 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895893 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895922 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895951 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895980 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896005 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896039 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896066 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896091 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896119 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896151 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896608 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896640 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896669 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896700 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896724 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896752 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896784 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896811 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896843 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896871 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896898 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896930 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896959 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896994 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897036 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897075 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897108 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897132 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897159 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897187 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897215 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897241 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897273 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897304 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897331 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897361 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897388 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897412 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897442 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897469 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897660 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897698 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897732 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897759 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897792 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897823 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897850 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897874 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 03 09:43:59 crc kubenswrapper[4990]: I1003
09:43:59.897903 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897930 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897955 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897983 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893647 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898105 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893649 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.893944 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894090 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.894839 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895354 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895401 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895755 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898244 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895916 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.895982 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896377 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896489 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896701 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896839 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896908 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.896969 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897274 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897393 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898366 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897684 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897972 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.897997 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898544 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898562 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898698 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.898012 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899360 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899348 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: 
"kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899381 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899397 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899492 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899630 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899771 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899852 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899950 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.899990 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900221 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900273 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900440 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900605 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900639 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900671 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900700 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900825 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900862 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900889 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900916 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900940 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900963 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.900984 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901007 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901031 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901051 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901074 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901097 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901123 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901173 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901201 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901225 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901251 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 
09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901279 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901303 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901334 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901384 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901412 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901436 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901466 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901494 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901568 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901576 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901689 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901724 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901758 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901787 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901817 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901845 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901871 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901897 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901920 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.901950 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 
09:43:59.901979 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902003 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902031 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902047 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902057 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902160 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902160 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902309 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902341 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902390 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902424 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902499 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902549 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902601 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902628 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902660 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902690 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902725 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902752 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 
03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902785 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902538 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902818 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902918 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902974 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903028 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903073 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903117 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903158 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903198 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903242 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903325 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903366 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903476 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903559 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:43:59 crc kubenswrapper[4990]: 
I1003 09:43:59.903626 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42f59e4b-517b-444e-8df2-5b8dae4d5d67-hosts-file\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903675 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904933 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904994 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905031 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905070 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nchtd\" (UniqueName: \"kubernetes.io/projected/42f59e4b-517b-444e-8df2-5b8dae4d5d67-kube-api-access-nchtd\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905135 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905165 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905195 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905258 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905291 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905321 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905427 4990 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905451 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905474 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905495 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905565 4990 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905589 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905609 4990 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905629 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc 
kubenswrapper[4990]: I1003 09:43:59.905652 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905668 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905684 4990 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905700 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905718 4990 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905732 4990 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905748 4990 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905767 4990 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905783 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905797 4990 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905812 4990 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905830 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905845 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905861 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905878 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905897 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905911 4990 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905926 4990 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905944 4990 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905977 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905995 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906010 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906028 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906049 4990 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906069 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906088 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906113 4990 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906135 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906155 4990 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node 
\"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906177 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906205 4990 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906226 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906247 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906273 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.914419 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.916053 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-68v62"] Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.916833 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bspdz"] Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.902924 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.903257 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904382 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904496 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904880 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.904933 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905083 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905249 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905594 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905658 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.906402 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.918100 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-03 09:44:00.418075895 +0000 UTC m=+22.214707742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.917027 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gb69z"] Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.934440 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.934798 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927360 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.925100 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.925232 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906880 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906955 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.907093 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.907391 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.907452 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.907871 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.908182 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.905436 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.908544 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.908703 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.908957 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.910592 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.910725 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927271 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bspdz" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928996 4990 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.938940 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.935499 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.910829 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.911034 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.911661 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912102 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912135 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912243 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912256 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912281 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.912301 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.913723 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.914082 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.914649 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.925731 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.925861 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.925918 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926218 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926399 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926554 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926735 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926891 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.926902 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927078 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927096 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927104 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927043 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927151 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927186 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.927684 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.940258 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:00.44024007 +0000 UTC m=+22.236871927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.927989 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928161 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928197 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928024 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928239 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928648 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928711 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928841 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.928857 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929191 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929234 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.906814 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929478 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929674 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929852 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.929980 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.930153 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.930572 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.931098 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.931257 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.931872 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.931884 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.931975 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.932148 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.932166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.933652 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.933687 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.933715 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.933868 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.933883 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.934532 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.934544 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.935029 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.935186 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.936622 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.937607 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.937808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.938113 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.938040 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.938400 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.938845 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.940646 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:00.44063895 +0000 UTC m=+22.237270807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.940897 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.941056 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.941871 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942047 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.941986 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942193 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942678 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942843 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942922 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.942977 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943132 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943241 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943362 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943404 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943595 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.943764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.944956 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.945187 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.945751 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.945837 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.946042 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.946047 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.945900 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.946182 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.946422 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.946476 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947057 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947056 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947151 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947434 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947611 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.947638 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.948201 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.949067 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.949177 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.949422 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.949539 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.951361 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.952072 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.952585 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.952646 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.952979 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.953130 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.955738 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.956914 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.956907 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.957216 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.957317 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.957671 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.957697 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.959255 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.957386 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.961793 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962262 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962269 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962491 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962520 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962566 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.962768 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.963065 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.963201 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.963270 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.965201 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.968230 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.968335 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.971438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.975056 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.976903 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.977592 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.979882 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.979909 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.979921 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.979977 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:00.479957941 +0000 UTC m=+22.276589798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.980148 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.980159 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.980167 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:43:59 crc kubenswrapper[4990]: E1003 09:43:59.980190 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:00.480184117 +0000 UTC m=+22.276815974 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:43:59 crc kubenswrapper[4990]: I1003 09:43:59.982853 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:43:59.998752 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007386 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007432 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007460 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-system-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007480 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-cnibin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007499 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-multus\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " 
pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007738 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21ea38c-26da-4987-a50d-bafecdfbbd02-proxy-tls\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007757 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007776 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-hostroot\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007795 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-multus-daemon-config\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007815 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f21ea38c-26da-4987-a50d-bafecdfbbd02-rootfs\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007835 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-os-release\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007857 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-netns\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007878 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-bin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007898 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/31671a76-378e-4899-89ae-d27e608c3cda-kube-api-access-zjq26\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007926 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-cni-binary-copy\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " 
pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007945 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-socket-dir-parent\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007965 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-kubelet\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.007984 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-multus-certs\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008021 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42f59e4b-517b-444e-8df2-5b8dae4d5d67-hosts-file\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-k8s-cni-cncf-io\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " 
pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008058 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-conf-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008079 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nchtd\" (UniqueName: \"kubernetes.io/projected/42f59e4b-517b-444e-8df2-5b8dae4d5d67-kube-api-access-nchtd\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008100 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f21ea38c-26da-4987-a50d-bafecdfbbd02-mcd-auth-proxy-config\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008134 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bmw6\" (UniqueName: \"kubernetes.io/projected/f21ea38c-26da-4987-a50d-bafecdfbbd02-kube-api-access-5bmw6\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008155 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-etc-kubernetes\") pod \"multus-bspdz\" 
(UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008223 4990 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008236 4990 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008247 4990 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008259 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008270 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008285 4990 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008298 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc 
kubenswrapper[4990]: I1003 09:44:00.008311 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008322 4990 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008334 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008346 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008357 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008369 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008381 4990 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008393 4990 reconciler_common.go:293] "Volume detached for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008404 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008416 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008427 4990 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008438 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008449 4990 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008463 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008475 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008487 4990 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008497 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008521 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008533 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008545 4990 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008556 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008568 4990 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 
03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008579 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008591 4990 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008602 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008614 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008625 4990 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008637 4990 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008649 4990 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008660 4990 
reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008670 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008681 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008694 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008705 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008715 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008725 4990 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008735 4990 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008746 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008757 4990 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008767 4990 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008779 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008790 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008802 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008812 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" 
DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008823 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008834 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008845 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008857 4990 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008868 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008879 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008892 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 
09:44:00.008903 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008914 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008926 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008938 4990 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008950 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008962 4990 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008973 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008984 4990 
reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008994 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009006 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009017 4990 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009028 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009039 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009049 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009060 4990 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009070 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009081 4990 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009091 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009101 4990 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009112 4990 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009122 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009139 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009149 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009159 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009170 4990 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009181 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009191 4990 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009202 4990 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009212 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 
crc kubenswrapper[4990]: I1003 09:44:00.009222 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009232 4990 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009241 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.008627 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009251 4990 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009279 4990 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009292 4990 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc 
kubenswrapper[4990]: I1003 09:44:00.009303 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009313 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009323 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009334 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009346 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009356 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009366 4990 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009377 4990 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009387 4990 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009397 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009406 4990 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009425 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009435 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009446 4990 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009456 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009497 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009522 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009533 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009544 4990 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009559 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009571 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009580 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc 
kubenswrapper[4990]: I1003 09:44:00.009590 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009600 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009611 4990 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009624 4990 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009634 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009646 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009658 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009669 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009679 4990 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009691 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009712 4990 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009726 4990 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009736 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009746 4990 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009765 4990 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc 
kubenswrapper[4990]: I1003 09:44:00.009776 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009785 4990 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009305 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009920 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/42f59e4b-517b-444e-8df2-5b8dae4d5d67-hosts-file\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.009793 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010137 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010147 4990 reconciler_common.go:293] "Volume detached for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010156 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010164 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010263 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010275 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010283 4990 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010291 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010298 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") 
on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010307 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010315 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010323 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010331 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010339 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010348 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.010357 4990 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc 
kubenswrapper[4990]: I1003 09:44:00.013734 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.014878 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.016132 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.026302 4990 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758" exitCode=255 Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.027082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758"} Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.027165 4990 scope.go:117] "RemoveContainer" containerID="9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.031385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.030364 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.037752 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.045420 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchtd\" (UniqueName: \"kubernetes.io/projected/42f59e4b-517b-444e-8df2-5b8dae4d5d67-kube-api-access-nchtd\") pod \"node-resolver-lrqf5\" (UID: \"42f59e4b-517b-444e-8df2-5b8dae4d5d67\") " pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.051775 4990 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.055897 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.066702 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.068564 4990 scope.go:117] "RemoveContainer" containerID="08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.068780 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.081989 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.107037 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b
6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.110880 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-netns\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: 
I1003 09:44:00.110927 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-socket-dir-parent\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.110953 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-multus-certs\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.110974 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/31671a76-378e-4899-89ae-d27e608c3cda-kube-api-access-zjq26\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.110998 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-system-cni-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111016 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-conf-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111039 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f21ea38c-26da-4987-a50d-bafecdfbbd02-mcd-auth-proxy-config\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111089 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6ppc\" (UniqueName: \"kubernetes.io/projected/aac0cf74-c31a-4c75-8810-556b8e787c9c-kube-api-access-r6ppc\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-etc-kubernetes\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111120 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bmw6\" (UniqueName: \"kubernetes.io/projected/f21ea38c-26da-4987-a50d-bafecdfbbd02-kube-api-access-5bmw6\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 
03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111135 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-cnibin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111150 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-multus\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111165 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111182 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-system-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111201 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-hostroot\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111219 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f21ea38c-26da-4987-a50d-bafecdfbbd02-rootfs\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111232 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-os-release\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-bin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111263 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-os-release\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111278 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-cni-binary-copy\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111292 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-kubelet\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111307 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111325 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-k8s-cni-cncf-io\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111356 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21ea38c-26da-4987-a50d-bafecdfbbd02-proxy-tls\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111370 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111384 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-multus-daemon-config\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111400 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-cnibin\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111423 4990 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111434 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111443 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111526 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-netns\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111577 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-system-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111788 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-hostroot\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111832 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f21ea38c-26da-4987-a50d-bafecdfbbd02-rootfs\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111876 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-os-release\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.111897 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-bin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112020 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-socket-dir-parent\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " 
pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112053 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-multus-certs\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112151 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-etc-kubernetes\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112363 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-conf-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112470 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-cni-binary-copy\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112531 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-kubelet\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112550 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-cnibin\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112599 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-var-lib-cni-multus\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112724 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-multus-cni-dir\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.112781 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/31671a76-378e-4899-89ae-d27e608c3cda-host-run-k8s-cni-cncf-io\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.113070 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f21ea38c-26da-4987-a50d-bafecdfbbd02-mcd-auth-proxy-config\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.113327 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/31671a76-378e-4899-89ae-d27e608c3cda-multus-daemon-config\") pod \"multus-bspdz\" 
(UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.116582 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21ea38c-26da-4987-a50d-bafecdfbbd02-proxy-tls\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.126647 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.130770 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.137036 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.138638 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjq26\" (UniqueName: \"kubernetes.io/projected/31671a76-378e-4899-89ae-d27e608c3cda-kube-api-access-zjq26\") pod \"multus-bspdz\" (UID: \"31671a76-378e-4899-89ae-d27e608c3cda\") " pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.140874 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.142901 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.143557 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bmw6\" (UniqueName: \"kubernetes.io/projected/f21ea38c-26da-4987-a50d-bafecdfbbd02-kube-api-access-5bmw6\") pod \"machine-config-daemon-68v62\" (UID: \"f21ea38c-26da-4987-a50d-bafecdfbbd02\") " pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.149358 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lrqf5" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.151403 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: W1003 09:44:00.153664 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-22a363a17e7f40ac12119a4bc121e24f81d9143489e3e7ddd68bb71ee8fb5f42 WatchSource:0}: Error finding container 22a363a17e7f40ac12119a4bc121e24f81d9143489e3e7ddd68bb71ee8fb5f42: Status 404 returned error can't find the container with id 22a363a17e7f40ac12119a4bc121e24f81d9143489e3e7ddd68bb71ee8fb5f42 Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.165851 4990 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.175629 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.187632 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.197212 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.209225 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.211967 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-cnibin\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212011 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-system-cni-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212046 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212068 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6ppc\" (UniqueName: \"kubernetes.io/projected/aac0cf74-c31a-4c75-8810-556b8e787c9c-kube-api-access-r6ppc\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212094 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212121 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-os-release\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 
09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212140 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212526 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-cnibin\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212584 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-system-cni-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-os-release\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.212879 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " 
pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.213094 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aac0cf74-c31a-4c75-8810-556b8e787c9c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.213745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aac0cf74-c31a-4c75-8810-556b8e787c9c-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.227624 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.237108 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6ppc\" (UniqueName: \"kubernetes.io/projected/aac0cf74-c31a-4c75-8810-556b8e787c9c-kube-api-access-r6ppc\") pod \"multus-additional-cni-plugins-gb69z\" (UID: \"aac0cf74-c31a-4c75-8810-556b8e787c9c\") " pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.282483 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gb69z" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.306772 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7rqmg"] Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.308565 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.311138 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.311366 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.311472 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.311707 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.311915 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.312237 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.313030 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.318475 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.318518 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bspdz" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.321703 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-man
ager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.332462 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.344559 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.352235 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.364345 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.375161 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.386656 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.409121 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413413 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413479 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413558 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413593 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413625 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413683 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413711 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413747 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413779 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413826 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413857 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413891 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413921 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.413996 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.414037 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.414068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.414109 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr7pd\" (UniqueName: \"kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.414141 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 
09:44:00.414173 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.429608 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:52Z\\\",\\\"message\\\":\\\"W1003 09:43:42.060827 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 09:43:42.061424 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759484622 cert, and key in /tmp/serving-cert-3719107380/serving-signer.crt, /tmp/serving-cert-3719107380/serving-signer.key\\\\nI1003 09:43:42.437953 1 observer_polling.go:159] Starting file observer\\\\nW1003 09:43:42.440563 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 09:43:42.440776 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:42.442416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3719107380/tls.crt::/tmp/serving-cert-3719107380/tls.key\\\\\\\"\\\\nF1003 09:43:52.604788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.444108 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.457639 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.475663 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.488766 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514635 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514657 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.514683 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:01.514664162 +0000 UTC m=+23.311296019 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514715 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514707 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514748 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514771 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514789 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514805 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514821 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514838 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr7pd\" (UniqueName: 
\"kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514860 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514905 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514924 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514945 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514963 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.514984 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515000 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515017 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515034 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515052 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515069 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515089 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515106 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515122 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515152 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515318 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515386 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:01.51535818 +0000 UTC m=+23.311990037 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515410 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515532 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515547 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515560 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:00 crc 
kubenswrapper[4990]: E1003 09:44:00.515592 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:01.515583406 +0000 UTC m=+23.312215323 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515623 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515682 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515726 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515757 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515780 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515817 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.515850 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:01.515842433 +0000 UTC m=+23.312474290 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515896 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515919 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515962 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.515984 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.516118 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.516193 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.516222 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.516240 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.516251 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.516296 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:01.516287235 +0000 UTC m=+23.312919092 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.516225 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.522025 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.534225 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr7pd\" (UniqueName: 
\"kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd\") pod \"ovnkube-node-7rqmg\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.741778 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:00 crc kubenswrapper[4990]: W1003 09:44:00.753490 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae9c9fa_1ce9_42dc_aa7f_e6f80d17a45c.slice/crio-25da2c61b63a282d8c2aca37731a7c787e40adb4e1b679f798cce85d5cfb39d3 WatchSource:0}: Error finding container 25da2c61b63a282d8c2aca37731a7c787e40adb4e1b679f798cce85d5cfb39d3: Status 404 returned error can't find the container with id 25da2c61b63a282d8c2aca37731a7c787e40adb4e1b679f798cce85d5cfb39d3 Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.871027 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:00 crc kubenswrapper[4990]: E1003 09:44:00.871217 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.875920 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.876673 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.878431 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.879259 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.880373 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.880899 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.881591 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.882619 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.883297 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.884270 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.884815 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.886119 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.886903 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.887675 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.888901 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.889649 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.892777 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.894011 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.895723 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.896906 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.898564 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.899663 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.900360 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.902216 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.902967 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.904768 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.905807 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.906815 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.907462 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.908396 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.908893 4990 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.909001 4990 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.911575 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.912417 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.912911 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.914714 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.915804 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.916455 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.917566 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.918250 4990 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.919131 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.919767 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.920887 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.922019 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.922618 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.923159 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.924224 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.925450 4990 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.926000 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.926943 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.927416 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.927965 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.928975 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 09:44:00 crc kubenswrapper[4990]: I1003 09:44:00.929507 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.034407 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.034454 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22a363a17e7f40ac12119a4bc121e24f81d9143489e3e7ddd68bb71ee8fb5f42"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.035961 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" exitCode=0 Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.036015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.036104 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"25da2c61b63a282d8c2aca37731a7c787e40adb4e1b679f798cce85d5cfb39d3"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.038276 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"115c0c0b5d85027e5282c4dcb4c126039ddb74eadc16d2c14ae50451e6d61707"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.040448 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" 
event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerStarted","Data":"8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.040526 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerStarted","Data":"26e5acca0b666792ef06b0fce0cc90392fb2e399d845406dfab5e02b0ab14fa3"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.042317 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrqf5" event={"ID":"42f59e4b-517b-444e-8df2-5b8dae4d5d67","Type":"ContainerStarted","Data":"08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.042350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrqf5" event={"ID":"42f59e4b-517b-444e-8df2-5b8dae4d5d67","Type":"ContainerStarted","Data":"d9e9194154847f897e445100630419505cfaed965113447ed22ab20cee4b7e55"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.043982 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.048042 4990 scope.go:117] "RemoveContainer" containerID="08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.048227 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 09:44:01 crc 
kubenswrapper[4990]: I1003 09:44:01.050362 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.050409 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.050423 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"0e8fc564f6eef89aa68cf04a01484d9440508784236d6463bfb05b193b3f4566"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.052366 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62" exitCode=0 Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.052422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.052452 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerStarted","Data":"d8bee822fbf87ab1ec1f54c7bde5b3ef4d6be6327bbb9cc3683e993ecae2ed0b"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.053937 4990 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.055080 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.055122 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.055139 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d0e91b61366c08db81fb6bfb4b8fedaae33f24144b6e4f88a0a2e06759fbc07c"} Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.070748 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.085108 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.114718 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.130685 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9751dc980be14394f13bdf6e8e6469a4d8bb5d54820d22c98201a871a93d0518\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:52Z\\\",\\\"message\\\":\\\"W1003 09:43:42.060827 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1003 09:43:42.061424 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759484622 cert, and key in /tmp/serving-cert-3719107380/serving-signer.crt, /tmp/serving-cert-3719107380/serving-signer.key\\\\nI1003 09:43:42.437953 1 observer_polling.go:159] Starting file observer\\\\nW1003 09:43:42.440563 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1003 09:43:42.440776 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:42.442416 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3719107380/tls.crt::/tmp/serving-cert-3719107380/tls.key\\\\\\\"\\\\nF1003 09:43:52.604788 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.145976 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.160901 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.174737 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.191887 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.207893 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.226387 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.241076 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.252908 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.273044 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.288202 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 
09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.302836 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.317386 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.329233 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.344914 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.357595 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.368492 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.382980 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.390915 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-j96ms"] Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.391359 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.393396 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.393915 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.393982 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.394083 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.402118 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.413977 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.428144 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.442159 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.462720 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19
933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.478600 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.492630 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.510192 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526226 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526341 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526379 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526419 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:03.526387597 +0000 UTC m=+25.323019454 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526473 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56ed72a9-69d0-4f5e-b38b-f91c1221c917-host\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526522 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526546 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526560 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526528 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56ed72a9-69d0-4f5e-b38b-f91c1221c917-serviceca\") pod \"node-ca-j96ms\" (UID: 
\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526610 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:03.526592232 +0000 UTC m=+25.323224159 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526640 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526661 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlfj\" (UniqueName: \"kubernetes.io/projected/56ed72a9-69d0-4f5e-b38b-f91c1221c917-kube-api-access-swlfj\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526753 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:03.526729335 +0000 UTC m=+25.323361262 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.526881 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526950 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526968 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526978 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.526977 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.527009 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:03.526999662 +0000 UTC m=+25.323631599 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.527032 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:03.527022583 +0000 UTC m=+25.323654440 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.529691 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.543941 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.558380 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.571910 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.584203 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.599134 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.623605 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.628047 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swlfj\" (UniqueName: \"kubernetes.io/projected/56ed72a9-69d0-4f5e-b38b-f91c1221c917-kube-api-access-swlfj\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.628148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56ed72a9-69d0-4f5e-b38b-f91c1221c917-host\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.628172 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56ed72a9-69d0-4f5e-b38b-f91c1221c917-serviceca\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") 
" pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.628233 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/56ed72a9-69d0-4f5e-b38b-f91c1221c917-host\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.629173 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/56ed72a9-69d0-4f5e-b38b-f91c1221c917-serviceca\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.674875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlfj\" (UniqueName: \"kubernetes.io/projected/56ed72a9-69d0-4f5e-b38b-f91c1221c917-kube-api-access-swlfj\") pod \"node-ca-j96ms\" (UID: \"56ed72a9-69d0-4f5e-b38b-f91c1221c917\") " pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.680543 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.705315 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j96ms" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.727704 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: W1003 09:44:01.728889 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ed72a9_69d0_4f5e_b38b_f91c1221c917.slice/crio-7e51d53d2403c143de6ef6327065bcf161cb1111eb67f7a936818f400dbc9e79 WatchSource:0}: Error finding container 7e51d53d2403c143de6ef6327065bcf161cb1111eb67f7a936818f400dbc9e79: Status 404 returned error can't find the container with id 7e51d53d2403c143de6ef6327065bcf161cb1111eb67f7a936818f400dbc9e79 Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.764049 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:01Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.871703 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:01 crc kubenswrapper[4990]: I1003 09:44:01.871737 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.872333 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:01 crc kubenswrapper[4990]: E1003 09:44:01.872477 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.061598 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2" exitCode=0 Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.061638 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.066498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.066563 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.066578 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" 
event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.066588 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.066598 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.068124 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j96ms" event={"ID":"56ed72a9-69d0-4f5e-b38b-f91c1221c917","Type":"ContainerStarted","Data":"7e51d53d2403c143de6ef6327065bcf161cb1111eb67f7a936818f400dbc9e79"} Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.089890 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.108393 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.121823 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.141665 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.158727 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.180874 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.203667 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.219568 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.232032 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.256139 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.273323 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.289723 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.307491 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.320838 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:02Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:02 crc kubenswrapper[4990]: I1003 09:44:02.870978 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:02 crc kubenswrapper[4990]: E1003 09:44:02.871972 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.075548 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.076907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d"} Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.078441 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j96ms" event={"ID":"56ed72a9-69d0-4f5e-b38b-f91c1221c917","Type":"ContainerStarted","Data":"4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd"} Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.081598 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb" exitCode=0 
Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.081643 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb"} Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.090085 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.105078 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.118176 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.132448 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.145336 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.157817 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.173362 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.185995 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.206831 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.222356 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.234455 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.247046 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.259342 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.274854 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.289940 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.300697 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.323046 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.336089 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 
09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.351235 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.364190 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.379192 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.394033 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.411690 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 
09:44:03.425032 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.439439 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.454540 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.465103 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.483304 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:03Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.546891 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.547005 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.547051 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.547080 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.547104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547243 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547267 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547280 4990 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547337 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:07.547319311 +0000 UTC m=+29.343951168 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547830 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547855 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:07.547837845 +0000 UTC m=+29.344469712 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547835 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547893 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:07.547880016 +0000 UTC m=+29.344511873 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547900 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547885 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.548011 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:07.547985539 +0000 UTC m=+29.344617406 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.547912 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.548076 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:07.548058531 +0000 UTC m=+29.344690498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.870969 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.871120 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:03 crc kubenswrapper[4990]: I1003 09:44:03.870989 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:03 crc kubenswrapper[4990]: E1003 09:44:03.871309 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.088643 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26" exitCode=0 Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.088747 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26"} Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.103369 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.118840 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.135589 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.152857 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.172092 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.188272 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.204290 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.218681 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.241154 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.256065 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.269125 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.281957 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.296204 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.313144 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.576002 4990 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.577947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.577983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.577993 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.578048 4990 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.586821 4990 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.587170 4990 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.588332 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.588363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.588370 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 
09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.588389 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.588405 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.606434 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbf
ff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3e
e8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\
"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.610476 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.610532 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.610544 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.610560 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.610572 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.623558 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.627365 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.627403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.627418 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.627435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.627447 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
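Editor's note: the x509 failure above is plain clock-versus-certificate arithmetic: the webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-03T09:44:04Z. A minimal sketch of how far past expiry the node is, using only the two timestamps copied from the error message (nothing is re-queried from the `127.0.0.1:9743` endpoint):

```python
from datetime import datetime, timezone

# Both timestamps are copied verbatim from the x509 error above.
now = datetime(2025, 10, 3, 9, 44, 4, tzinfo=timezone.utc)          # "current time"
not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)  # certificate notAfter

delta = now - not_after
print(f"certificate expired {delta.days} days, {delta.seconds // 3600} hours ago")
# -> certificate expired 39 days, 16 hours ago
```

Any skew this large guarantees the `node.network-node-identity.openshift.io` webhook call fails TLS verification, which is why every node-status patch below is rejected and retried.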
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.643216 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.649808 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.649860 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.649872 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.649893 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.649904 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
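Editor's note: the repeated `"Node became not ready"` entries from `setters.go:603` carry the Ready condition as inline JSON. A small sketch of pulling the machine-readable fields out of that payload (the condition string below is copied from the entry above, with the long `message` abbreviated):

```python
import json

# Ready condition as logged by setters.go:603 above (message abbreviated).
condition = json.loads(
    '{"type":"Ready","status":"False",'
    '"lastHeartbeatTime":"2025-10-03T09:44:04Z",'
    '"lastTransitionTime":"2025-10-03T09:44:04Z",'
    '"reason":"KubeletNotReady",'
    '"message":"container runtime network not ready: NetworkReady=false"}'
)
print(condition["reason"], "-", condition["status"])
# -> KubeletNotReady - False
```

The `reason`/`status` pair is what `kubectl describe node crc` would surface; the underlying cause here is the missing CNI configuration in `/etc/kubernetes/cni/net.d/` reported in the `message`.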
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.663289 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.666645 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.666689 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.666698 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.666714 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.666725 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.679135 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:04Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.679346 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.680760 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.680829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.680847 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.680871 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.680892 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.783264 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.783315 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.783328 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.783346 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.783388 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.871302 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:04 crc kubenswrapper[4990]: E1003 09:44:04.871479 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.886102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.886168 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.886189 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.886215 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.886236 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.988662 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.988699 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.988709 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.988726 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:04 crc kubenswrapper[4990]: I1003 09:44:04.988737 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:04Z","lastTransitionTime":"2025-10-03T09:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.091458 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.091493 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.091503 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.091549 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.091562 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.095816 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01" exitCode=0 Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.095887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.099711 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.112498 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.126091 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.136649 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.151353 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.162815 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.174152 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.185409 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.194371 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.194411 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.194423 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.194442 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.194453 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.199288 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.209776 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.228059 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.240371 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.253022 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.266169 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.279618 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:05Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.297111 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.297156 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.297164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.297180 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.297191 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.399725 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.399769 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.399779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.399795 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.399806 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.502482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.502545 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.502558 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.502580 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.502595 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.608748 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.608811 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.608828 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.608852 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.608867 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.711678 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.711728 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.711745 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.711769 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.711786 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.815116 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.815148 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.815157 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.815171 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.815179 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.842844 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.843648 4990 scope.go:117] "RemoveContainer" containerID="08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758" Oct 03 09:44:05 crc kubenswrapper[4990]: E1003 09:44:05.843803 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.871163 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.871207 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:05 crc kubenswrapper[4990]: E1003 09:44:05.871304 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:05 crc kubenswrapper[4990]: E1003 09:44:05.871446 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.918187 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.918245 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.918260 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.918282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:05 crc kubenswrapper[4990]: I1003 09:44:05.918300 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:05Z","lastTransitionTime":"2025-10-03T09:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.021780 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.021843 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.021860 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.021880 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.021892 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.107678 4990 generic.go:334] "Generic (PLEG): container finished" podID="aac0cf74-c31a-4c75-8810-556b8e787c9c" containerID="cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee" exitCode=0 Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.107763 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerDied","Data":"cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.125466 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.125551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.125569 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.125593 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.125611 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.127357 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.163106 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.197904 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.215898 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.228303 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.228351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.228360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.228376 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.228389 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.231883 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.245947 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.266754 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.280239 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 
09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.293036 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.305093 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.319943 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.331153 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.331193 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.331204 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.331221 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.331233 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.333313 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z 
is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.345727 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.356920 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:06Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.433261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.433301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.433312 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.433329 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.433341 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.536869 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.537201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.537212 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.537239 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.537251 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.639929 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.639971 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.639984 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.640001 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.640012 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.742914 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.742990 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.743017 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.743046 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.743067 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.845683 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.845713 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.845724 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.845741 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.845760 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.871307 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:06 crc kubenswrapper[4990]: E1003 09:44:06.871457 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.948843 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.948903 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.948932 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.948958 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:06 crc kubenswrapper[4990]: I1003 09:44:06.948970 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:06Z","lastTransitionTime":"2025-10-03T09:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.052449 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.052538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.052551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.052572 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.052585 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.117747 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" event={"ID":"aac0cf74-c31a-4c75-8810-556b8e787c9c","Type":"ContainerStarted","Data":"03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.123102 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.123381 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.123501 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.123524 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.139630 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19
933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.151476 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.154639 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.155468 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.155938 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.155979 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.155995 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.156019 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.156037 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.166194 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.183882 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09
:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.200656 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-addition
al-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec
8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.217415 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.230695 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.244092 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.255984 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.258765 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.258793 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.258803 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.258818 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.258828 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.266657 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.281661 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.292476 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.302157 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.318367 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.328825 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.342127 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.356721 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.361575 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.361623 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.361637 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.361656 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.361669 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.371946 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.386196 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\
\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\
\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.398074 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.409328 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.419130 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.430684 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.443217 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.454313 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.464466 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.464542 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.464561 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc 
kubenswrapper[4990]: I1003 09:44:07.464585 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.464600 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.478554 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.493166 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:0
0Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.508836 4990 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:07Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.567343 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.567403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.567413 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.567430 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.567439 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.591309 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.591381 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591428 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:15.591407778 +0000 UTC m=+37.388039635 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.591449 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.591496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591549 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.591559 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591569 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591583 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591622 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:15.591611023 +0000 UTC m=+37.388242880 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591625 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591645 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591670 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:15.591658745 +0000 UTC m=+37.388290612 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591679 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591722 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591741 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591689 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:15.591681065 +0000 UTC m=+37.388312922 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.591833 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:15.591811068 +0000 UTC m=+37.388442985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.670827 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.670872 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.670881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.670898 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.670911 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.774078 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.774124 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.774135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.774152 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.774163 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.871814 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.871847 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.872046 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:07 crc kubenswrapper[4990]: E1003 09:44:07.872213 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.877192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.877257 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.877269 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.877291 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.877306 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.980055 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.980102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.980112 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.980167 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:07 crc kubenswrapper[4990]: I1003 09:44:07.980177 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:07Z","lastTransitionTime":"2025-10-03T09:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.082559 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.082608 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.082621 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.082640 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.082651 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.185551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.185629 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.185643 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.185666 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.185679 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.287696 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.287742 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.287752 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.287772 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.287784 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.390599 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.390964 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.390976 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.390993 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.391005 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.493256 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.493332 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.493342 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.493358 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.493367 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.596489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.596608 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.596633 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.596664 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.596684 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.700174 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.700213 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.700225 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.700243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.700255 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.803317 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.803376 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.803386 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.803400 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.803410 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.871759 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:08 crc kubenswrapper[4990]: E1003 09:44:08.871908 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.887273 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.905364 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.906275 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.906325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.906337 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.906355 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.906366 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:08Z","lastTransitionTime":"2025-10-03T09:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.925488 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.941819 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.961022 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688
df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Com
pleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:08 crc kubenswrapper[4990]: I1003 09:44:08.975191 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:08.999995 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:08Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.008287 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.008325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.008336 4990 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.008355 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.008367 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.013322 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.023359 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.033669 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.047401 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.063316 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.075855 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.099669 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:09Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.111064 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.111140 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.111160 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.111182 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.111196 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.213388 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.213460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.213470 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.213483 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.213493 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.317196 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.317678 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.317777 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.317861 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.317943 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.421190 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.421236 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.421249 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.421271 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.421281 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.524034 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.524111 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.524138 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.524172 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.524194 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.627295 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.627375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.627398 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.627445 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.627473 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.730432 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.730544 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.730563 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.730619 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.730638 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.833811 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.833871 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.833896 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.833925 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.833947 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.870874 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.870992 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:09 crc kubenswrapper[4990]: E1003 09:44:09.871038 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:09 crc kubenswrapper[4990]: E1003 09:44:09.871130 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.937478 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.937557 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.937573 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.937598 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:09 crc kubenswrapper[4990]: I1003 09:44:09.937614 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:09Z","lastTransitionTime":"2025-10-03T09:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.040757 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.040823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.040844 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.040868 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.040886 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.135449 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/0.log" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.138832 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5" exitCode=1 Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.138898 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.139628 4990 scope.go:117] "RemoveContainer" containerID="133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.143162 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.143231 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.143249 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.143270 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.143325 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.158708 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.173918 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.191036 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.210369 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.228605 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.244738 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.246333 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.246360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.246372 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.246390 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.246402 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.261908 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.274941 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.287113 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.300564 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.311907 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.331369 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:09Z\\\",\\\"message\\\":\\\"nNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 09:44:09.515432 6297 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 
09:44:09.515475 6297 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 09:44:09.515534 6297 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 09:44:09.515546 6297 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 09:44:09.515551 6297 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 09:44:09.515570 6297 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 09:44:09.515582 6297 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 09:44:09.515592 6297 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 09:44:09.515599 6297 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 09:44:09.516565 6297 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 09:44:09.516595 6297 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 09:44:09.516645 6297 factory.go:656] Stopping watch factory\\\\nI1003 09:44:09.516660 6297 ovnkube.go:599] Stopped ovnkube\\\\nI1003 09:44:09.516685 6297 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.345892 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.348705 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.348739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.348748 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.348765 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.348774 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.362112 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:10Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.451341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.451393 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.451406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.451427 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.451439 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.554181 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.554224 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.554234 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.554253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.554263 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.656645 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.656702 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.656718 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.656740 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.656754 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.764646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.764703 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.764713 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.764731 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.764740 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.868009 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.868069 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.868081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.868101 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.868115 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.871280 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:10 crc kubenswrapper[4990]: E1003 09:44:10.871405 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.970405 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.970457 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.970467 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.970485 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:10 crc kubenswrapper[4990]: I1003 09:44:10.970498 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:10Z","lastTransitionTime":"2025-10-03T09:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.073379 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.073421 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.073432 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.073452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.073466 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.145654 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/0.log" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.149474 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.150129 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.164173 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.176541 4990 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.176591 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.176610 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.176631 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.176644 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.179529 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.194614 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.208343 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.222778 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.239705 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.253011 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.264402 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.278712 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.278750 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.278758 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc 
kubenswrapper[4990]: I1003 09:44:11.278773 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.278783 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.287093 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:09Z\\\",\\\"message\\\":\\\"nNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 09:44:09.515432 6297 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 09:44:09.515475 6297 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 09:44:09.515534 6297 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 09:44:09.515546 6297 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 09:44:09.515551 6297 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 09:44:09.515570 6297 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 09:44:09.515582 6297 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 09:44:09.515592 6297 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 09:44:09.515599 6297 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 09:44:09.516565 6297 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 09:44:09.516595 6297 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 09:44:09.516645 6297 factory.go:656] Stopping watch factory\\\\nI1003 09:44:09.516660 6297 ovnkube.go:599] Stopped ovnkube\\\\nI1003 09:44:09.516685 6297 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.300108 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19
933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.312989 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.324180 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.335917 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.347681 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:0
2Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.382051 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.382100 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.382110 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.382126 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.382137 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.453966 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4"] Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.454814 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.457778 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.458834 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.480089 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.485643 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.485716 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.485741 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.485774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.485799 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.496109 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.510611 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.524755 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.532595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.532681 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.532710 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.532740 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8zq\" (UniqueName: \"kubernetes.io/projected/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-kube-api-access-2x8zq\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: 
\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.541186 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.555584 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.574897 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.588974 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.589002 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.589011 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.589028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.589038 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.594626 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.611858 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.624481 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.633482 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.633525 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.633549 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8zq\" (UniqueName: \"kubernetes.io/projected/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-kube-api-access-2x8zq\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.633572 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.634260 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.634460 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.640705 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.651281 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8zq\" (UniqueName: \"kubernetes.io/projected/a29e92f6-66b6-445b-b7c2-a708c69f6c3e-kube-api-access-2x8zq\") pod \"ovnkube-control-plane-749d76644c-l8sx4\" (UID: \"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.652813 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:09Z\\\",\\\"message\\\":\\\"nNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 09:44:09.515432 6297 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 09:44:09.515475 6297 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 09:44:09.515534 6297 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 09:44:09.515546 6297 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 09:44:09.515551 6297 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 09:44:09.515570 6297 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 09:44:09.515582 6297 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 09:44:09.515592 6297 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 09:44:09.515599 6297 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 09:44:09.516565 6297 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 09:44:09.516595 6297 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 09:44:09.516645 6297 factory.go:656] Stopping watch factory\\\\nI1003 09:44:09.516660 6297 ovnkube.go:599] Stopped ovnkube\\\\nI1003 09:44:09.516685 6297 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 
09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.668134 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.681720 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.692818 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.692887 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.692905 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.692927 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc 
kubenswrapper[4990]: I1003 09:44:11.692942 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.693119 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.707838 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e
1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:11Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.772342 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" Oct 03 09:44:11 crc kubenswrapper[4990]: W1003 09:44:11.794395 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda29e92f6_66b6_445b_b7c2_a708c69f6c3e.slice/crio-a607a2d7a1326d25a2e2fa5db624223d56415dc7cac76cb94dde26313907b453 WatchSource:0}: Error finding container a607a2d7a1326d25a2e2fa5db624223d56415dc7cac76cb94dde26313907b453: Status 404 returned error can't find the container with id a607a2d7a1326d25a2e2fa5db624223d56415dc7cac76cb94dde26313907b453 Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.796414 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.796452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.796466 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.796485 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.796498 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.870866 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.870922 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:11 crc kubenswrapper[4990]: E1003 09:44:11.871027 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:11 crc kubenswrapper[4990]: E1003 09:44:11.871126 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.899741 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.899785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.899800 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.899819 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:11 crc kubenswrapper[4990]: I1003 09:44:11.899862 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:11Z","lastTransitionTime":"2025-10-03T09:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.002092 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.002133 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.002148 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.002168 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.002183 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.105989 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.106056 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.106069 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.106102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.106121 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.156275 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/1.log" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.156901 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/0.log" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.159688 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c" exitCode=1 Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.159776 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.159812 4990 scope.go:117] "RemoveContainer" containerID="133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.160740 4990 scope.go:117] "RemoveContainer" containerID="ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c" Oct 03 09:44:12 crc kubenswrapper[4990]: E1003 09:44:12.160935 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.164191 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" event={"ID":"a29e92f6-66b6-445b-b7c2-a708c69f6c3e","Type":"ContainerStarted","Data":"955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.164250 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" event={"ID":"a29e92f6-66b6-445b-b7c2-a708c69f6c3e","Type":"ContainerStarted","Data":"a607a2d7a1326d25a2e2fa5db624223d56415dc7cac76cb94dde26313907b453"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.175484 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.188440 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.204568 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.209012 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.209049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.209059 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.209075 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.209085 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.220405 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z 
is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.237839 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.253715 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.265136 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.292141 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.303108 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.310940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.310971 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.310982 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.310998 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.311007 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.312297 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.321922 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.343857 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.360871 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.392427 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133a47b2efc92f1c669d10f02b68d530085d23e3e1caaa50e7ba59ad237befb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:09Z\\\",\\\"message\\\":\\\"nNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1003 09:44:09.515432 6297 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 09:44:09.515475 6297 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 09:44:09.515534 6297 
handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 09:44:09.515546 6297 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 09:44:09.515551 6297 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1003 09:44:09.515570 6297 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 09:44:09.515582 6297 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 09:44:09.515592 6297 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 09:44:09.515599 6297 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 09:44:09.516565 6297 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 09:44:09.516595 6297 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 09:44:09.516645 6297 factory.go:656] Stopping watch factory\\\\nI1003 09:44:09.516660 6297 ovnkube.go:599] Stopped ovnkube\\\\nI1003 09:44:09.516685 6297 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 09\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), 
V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\
\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.405864 4990 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:12Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.413638 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.413809 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.413881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.413963 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.414030 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.517210 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.517534 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.517650 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.517729 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.517800 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.620417 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.620477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.620495 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.620549 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.620568 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.723100 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.723145 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.723158 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.723179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.723191 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.826398 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.826459 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.826490 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.826570 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.826610 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.872533 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:12 crc kubenswrapper[4990]: E1003 09:44:12.872701 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.928763 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.928799 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.928808 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.928823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:12 crc kubenswrapper[4990]: I1003 09:44:12.928834 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:12Z","lastTransitionTime":"2025-10-03T09:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.032038 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.032111 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.032130 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.032155 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.032173 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.134763 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.134829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.134842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.134868 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.134881 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.171369 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/1.log" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.176073 4990 scope.go:117] "RemoveContainer" containerID="ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.176327 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.177957 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" event={"ID":"a29e92f6-66b6-445b-b7c2-a708c69f6c3e","Type":"ContainerStarted","Data":"5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.190961 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.205394 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.218848 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.238237 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.238313 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.238325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc 
kubenswrapper[4990]: I1003 09:44:13.238346 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.238371 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.248362 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.263908 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.279701 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.294786 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.309144 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.312986 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gdrcw"] Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.313527 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.313598 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.324596 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.335617 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.341099 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.341135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.341145 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.341160 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.341170 4990 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.348330 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.348377 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66shp\" (UniqueName: \"kubernetes.io/projected/b2a21582-ac04-4caa-a823-7c30c7f788c9-kube-api-access-66shp\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.349673 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.362775 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.379440 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.395570 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.412075 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.429287 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.444027 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.444070 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.444082 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.444100 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.444112 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.446622 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.448834 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.448882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66shp\" (UniqueName: \"kubernetes.io/projected/b2a21582-ac04-4caa-a823-7c30c7f788c9-kube-api-access-66shp\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.449023 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 
09:44:13.449125 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:13.949102466 +0000 UTC m=+35.745734333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.460078 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4274
5f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.469281 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66shp\" (UniqueName: \"kubernetes.io/projected/b2a21582-ac04-4caa-a823-7c30c7f788c9-kube-api-access-66shp\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.489482 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.506385 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.524555 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.536947 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.547411 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.547461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.547471 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.547489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.547500 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.560696 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.577712 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.597493 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688
df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Com
pleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.619820 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.634774 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.649103 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.649988 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.650032 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.650042 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.650063 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.650076 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.660876 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.670885 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34ece
b8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.680755 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:13Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.753827 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.753963 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.753992 4990 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.754022 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.754051 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.857677 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.857728 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.857742 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.857763 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.857780 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.871818 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.871818 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.872096 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.872165 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.953802 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.954038 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:13 crc kubenswrapper[4990]: E1003 09:44:13.954136 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:14.954115717 +0000 UTC m=+36.750747564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.960571 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.960603 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.960613 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.960629 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:13 crc kubenswrapper[4990]: I1003 09:44:13.960639 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:13Z","lastTransitionTime":"2025-10-03T09:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.064049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.064139 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.064163 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.064196 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.064220 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.167829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.167881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.167890 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.167908 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.167919 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.271387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.271428 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.271435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.271451 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.271460 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.374122 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.374179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.374190 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.374206 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.374216 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.476782 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.476820 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.476830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.476844 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.476852 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.580252 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.580363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.580386 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.580421 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.580443 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.683680 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.683721 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.683731 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.683746 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.683756 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.786897 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.787380 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.787403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.787427 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.787445 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.871817 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.871892 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.871986 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.872064 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.890013 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.890061 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.890072 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.890088 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.890100 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.896575 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.896644 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.896667 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.896708 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.896734 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.913299 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:14Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.916866 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.916907 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.916921 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.916939 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.916955 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.961665 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.961841 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.961920 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:16.961898439 +0000 UTC m=+38.758530326 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.963817 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-3
3cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:14Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.970461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.970525 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.970538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.970557 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.970570 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.985216 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:14Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:14 crc kubenswrapper[4990]: E1003 09:44:14.985371 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.993579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.993651 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.993675 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.993706 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:14 crc kubenswrapper[4990]: I1003 09:44:14.993728 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:14Z","lastTransitionTime":"2025-10-03T09:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.097762 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.097804 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.097816 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.097838 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.097852 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.201132 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.201208 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.201231 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.201260 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.201282 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.304776 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.304846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.304856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.304878 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.304891 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.408540 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.408596 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.408610 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.408630 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.408642 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.512578 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.512651 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.512676 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.512703 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.512720 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.615933 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.616005 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.616017 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.616037 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.616051 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.667614 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.667759 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.667858 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:44:31.667822434 +0000 UTC m=+53.464454341 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.667933 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.667961 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.667954 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.667979 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.667999 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668051 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:31.66802726 +0000 UTC m=+53.464659137 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.668087 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668120 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668140 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668145 4990 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668160 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668182 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:31.668174994 +0000 UTC m=+53.464806851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668195 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668202 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:31.668188384 +0000 UTC m=+53.464820261 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.668244 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:31.668234605 +0000 UTC m=+53.464866472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.719727 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.719795 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.719813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.719842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.719862 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.822848 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.822887 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.822896 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.822909 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.822919 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.871292 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.871310 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.871501 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:15 crc kubenswrapper[4990]: E1003 09:44:15.871682 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.926012 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.926088 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.926112 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.926142 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:15 crc kubenswrapper[4990]: I1003 09:44:15.926166 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:15Z","lastTransitionTime":"2025-10-03T09:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.029062 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.029125 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.029164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.029198 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.029220 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.131782 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.131821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.131829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.131843 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.131852 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.235289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.235374 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.235384 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.235401 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.235411 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.338319 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.338367 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.338382 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.338406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.338418 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.440525 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.440567 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.440575 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.440593 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.440603 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.543689 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.543760 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.543784 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.543812 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.543842 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.647252 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.647321 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.647344 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.647375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.647400 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.751569 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.751617 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.751627 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.751646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.751656 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.855082 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.855135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.855144 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.855162 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.855171 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.871571 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.871589 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:16 crc kubenswrapper[4990]: E1003 09:44:16.871789 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:16 crc kubenswrapper[4990]: E1003 09:44:16.871855 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.962116 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.962176 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.962188 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.962207 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.962222 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:16Z","lastTransitionTime":"2025-10-03T09:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:16 crc kubenswrapper[4990]: I1003 09:44:16.979106 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:16 crc kubenswrapper[4990]: E1003 09:44:16.979309 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:16 crc kubenswrapper[4990]: E1003 09:44:16.979438 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:20.979403504 +0000 UTC m=+42.776035401 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.064463 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.064538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.064551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.064573 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.064586 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.167448 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.167556 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.167578 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.167605 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.167622 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.270836 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.270890 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.270902 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.270923 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.270936 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.374603 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.374642 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.374650 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.374668 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.374677 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.477585 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.477707 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.477739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.477774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.477803 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.581011 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.581059 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.581071 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.581089 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.581102 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.684338 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.684396 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.684410 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.684435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.684451 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.787152 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.787203 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.787219 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.787243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.787260 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.870736 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.870813 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:17 crc kubenswrapper[4990]: E1003 09:44:17.870873 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:17 crc kubenswrapper[4990]: E1003 09:44:17.870957 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.889676 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.889736 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.889755 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.889782 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.889802 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.992612 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.992664 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.992685 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.992708 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:17 crc kubenswrapper[4990]: I1003 09:44:17.992725 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:17Z","lastTransitionTime":"2025-10-03T09:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.095926 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.095999 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.096021 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.096046 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.096064 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.198783 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.198842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.198854 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.198869 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.198879 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.301552 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.301617 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.301630 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.301651 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.301664 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.405588 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.405652 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.405670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.405693 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.405708 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.509223 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.509289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.509300 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.509318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.509689 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.612126 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.612166 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.612178 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.612196 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.612205 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.715431 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.715486 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.715498 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.715533 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.715545 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.818347 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.818423 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.818440 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.818481 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.818492 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.871545 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.871564 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:18 crc kubenswrapper[4990]: E1003 09:44:18.871698 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:18 crc kubenswrapper[4990]: E1003 09:44:18.871761 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.892700 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.911236 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.921179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.921218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.921230 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.921247 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.921260 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:18Z","lastTransitionTime":"2025-10-03T09:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.927217 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.940379 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3
461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.954191 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.968759 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:18 crc kubenswrapper[4990]: I1003 09:44:18.988669 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.000593 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:18Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.012861 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025340 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025864 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025878 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025895 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.025910 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.037258 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.050375 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.063161 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.081140 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.095587 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.111870 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:19Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.128887 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.128958 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.128970 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.128988 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.129001 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.232189 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.232242 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.232257 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.232275 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.232286 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.334639 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.334686 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.334696 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.334711 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.334719 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.438249 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.438478 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.438486 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.438501 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.438521 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.540730 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.540778 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.540792 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.540821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.540833 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.643368 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.643429 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.643445 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.643470 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.643486 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.746822 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.746882 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.746909 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.746940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.746964 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.850714 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.850789 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.850813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.850842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.850864 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.871016 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.871026 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:19 crc kubenswrapper[4990]: E1003 09:44:19.871196 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:19 crc kubenswrapper[4990]: E1003 09:44:19.871329 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.955407 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.955464 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.955477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.955494 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:19 crc kubenswrapper[4990]: I1003 09:44:19.955505 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:19Z","lastTransitionTime":"2025-10-03T09:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.059261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.059301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.059310 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.059322 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.059332 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.161658 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.161724 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.161739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.161761 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.161776 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.264999 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.265060 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.265076 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.265097 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.265111 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.367834 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.367892 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.367908 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.367929 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.367942 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.470454 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.470542 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.470558 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.470581 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.470593 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.574680 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.574779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.575412 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.575498 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.575816 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.678647 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.678675 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.678683 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.678698 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.678707 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.781178 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.781243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.781254 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.781287 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.781299 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.871831 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.871825 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:20 crc kubenswrapper[4990]: E1003 09:44:20.872150 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:20 crc kubenswrapper[4990]: E1003 09:44:20.872279 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.872453 4990 scope.go:117] "RemoveContainer" containerID="08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.884044 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.884097 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.884111 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.884130 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.884141 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.986963 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.987028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.987040 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.987058 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:20 crc kubenswrapper[4990]: I1003 09:44:20.987447 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:20Z","lastTransitionTime":"2025-10-03T09:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.023925 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:21 crc kubenswrapper[4990]: E1003 09:44:21.024141 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:21 crc kubenswrapper[4990]: E1003 09:44:21.024238 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:29.024212728 +0000 UTC m=+50.820844595 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.090497 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.090550 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.090563 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.090579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.090590 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.194867 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.194957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.194974 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.194998 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.195049 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.207414 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.210129 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.211499 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.237769 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acba
b85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd36
7c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.252325 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.265239 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.282291 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:4
3:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6
f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.299883 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.299936 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.299948 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: 
I1003 09:44:21.299966 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.299978 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.301238 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.327993 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.342051 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.356960 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.374249 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.386828 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.403308 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.403359 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.403369 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc 
kubenswrapper[4990]: I1003 09:44:21.403384 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.403394 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.404755 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.418060 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.435682 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.448977 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.462765 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.476715 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:21Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.505946 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.505988 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.506000 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.506018 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.506030 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.608982 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.609057 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.609095 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.609125 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.609143 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.711998 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.712036 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.712047 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.712062 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.712074 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.815224 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.815300 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.815327 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.815361 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.815380 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.871317 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:21 crc kubenswrapper[4990]: E1003 09:44:21.871467 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.871317 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:21 crc kubenswrapper[4990]: E1003 09:44:21.871668 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.919032 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.919077 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.919091 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.919110 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:21 crc kubenswrapper[4990]: I1003 09:44:21.919124 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:21Z","lastTransitionTime":"2025-10-03T09:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.022116 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.022176 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.022192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.022216 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.022236 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.125905 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.125958 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.125979 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.126006 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.126026 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.228336 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.228370 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.228380 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.228393 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.228401 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.331799 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.331856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.331872 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.331898 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.331915 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.434821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.434867 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.434882 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.434899 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.434910 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.537561 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.537605 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.537616 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.537634 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.537645 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.640298 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.640351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.640371 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.640389 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.640403 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.743135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.743187 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.743202 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.743218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.743230 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.846261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.846331 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.846341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.846354 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.846363 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.870824 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.870842 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:22 crc kubenswrapper[4990]: E1003 09:44:22.871019 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:22 crc kubenswrapper[4990]: E1003 09:44:22.871102 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.948216 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.948255 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.948266 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.948282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:22 crc kubenswrapper[4990]: I1003 09:44:22.948293 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:22Z","lastTransitionTime":"2025-10-03T09:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.050737 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.050784 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.050793 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.050806 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.050815 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.154941 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.155001 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.155015 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.155031 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.155045 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.258011 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.258061 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.258071 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.258085 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.258094 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.361264 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.361305 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.361315 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.361330 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.361340 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.464265 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.464325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.464334 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.464353 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.464364 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.567978 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.568109 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.568129 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.568155 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.568172 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.671026 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.671096 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.671118 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.671147 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.671169 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.774720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.774772 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.774783 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.774805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.774818 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.871181 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:23 crc kubenswrapper[4990]: E1003 09:44:23.871304 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.871667 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:23 crc kubenswrapper[4990]: E1003 09:44:23.871720 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.872390 4990 scope.go:117] "RemoveContainer" containerID="ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.876666 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.876723 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.876747 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.876776 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.876801 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.980823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.980929 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.980946 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.980969 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:23 crc kubenswrapper[4990]: I1003 09:44:23.980983 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:23Z","lastTransitionTime":"2025-10-03T09:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.084318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.084399 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.084423 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.084452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.084470 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.187670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.187726 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.187738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.187760 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.187777 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.291090 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.291148 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.291159 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.291179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.291191 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.393957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.394011 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.394028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.394053 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.394072 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.496699 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.496751 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.496763 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.496780 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.496791 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.599306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.599348 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.599358 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.599375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.599387 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.701734 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.701782 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.701793 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.701812 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.701823 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.804325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.804368 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.804379 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.804395 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.804403 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.871087 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.871136 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:24 crc kubenswrapper[4990]: E1003 09:44:24.871253 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:24 crc kubenswrapper[4990]: E1003 09:44:24.871334 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.907170 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.907217 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.907231 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.907249 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:24 crc kubenswrapper[4990]: I1003 09:44:24.907261 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:24Z","lastTransitionTime":"2025-10-03T09:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.010719 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.011150 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.011164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.011183 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.011196 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.037236 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.037277 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.037288 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.037306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.037320 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.055209 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.058776 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.058809 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.058818 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.058833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.058845 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.090619 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.095829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.095882 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.095897 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.095922 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.095938 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.128876 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.133215 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.133254 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.133265 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.133280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.133290 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.149920 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.154298 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.154352 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.154366 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.154386 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.154399 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.168384 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.168564 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.170265 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.170301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.170316 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.170336 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.170350 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.225082 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/2.log" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.225858 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/1.log" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.228817 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615" exitCode=1 Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.228880 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.228979 4990 scope.go:117] "RemoveContainer" containerID="ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.229780 4990 scope.go:117] "RemoveContainer" containerID="f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615" Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.231260 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.244457 4990 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.263488 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.272862 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.272904 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.272916 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.272934 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.272944 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.284907 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f906
13d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.301000 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.318774 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.334764 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.355597 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.370968 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.375704 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.375749 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.375764 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.375785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.375803 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.385539 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.406154 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.424855 4990 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc 
kubenswrapper[4990]: I1003 09:44:25.443443 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.455191 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.468945 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.478192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.478250 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.478261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.478281 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.478296 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.482565 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z 
is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.501951 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:25Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.580939 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.580997 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.581013 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.581037 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.581054 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.683865 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.683931 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.683946 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.683971 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.683988 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.787569 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.787612 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.787621 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.787636 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.787645 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.871460 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.871648 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.872061 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:25 crc kubenswrapper[4990]: E1003 09:44:25.872319 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.890482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.890551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.890565 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.890585 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.890599 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.994642 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.994715 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.994736 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.994762 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:25 crc kubenswrapper[4990]: I1003 09:44:25.994777 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:25Z","lastTransitionTime":"2025-10-03T09:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.098579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.098625 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.098636 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.098652 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.098663 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.202841 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.202890 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.202903 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.202946 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.202962 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.235658 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/2.log" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.306270 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.306332 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.306348 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.306365 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.306376 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.409796 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.409880 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.409892 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.409915 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.409930 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.513743 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.513781 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.513790 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.513805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.513814 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.616839 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.616881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.616893 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.616910 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.616922 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.719114 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.719161 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.719173 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.719192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.719205 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.822724 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.822765 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.822774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.822793 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.822803 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.871809 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.871858 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:26 crc kubenswrapper[4990]: E1003 09:44:26.872003 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:26 crc kubenswrapper[4990]: E1003 09:44:26.872156 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.925733 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.925784 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.925796 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.925816 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:26 crc kubenswrapper[4990]: I1003 09:44:26.925831 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:26Z","lastTransitionTime":"2025-10-03T09:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.029151 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.029216 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.029233 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.029259 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.029277 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.133571 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.133645 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.133669 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.133699 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.133723 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.236995 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.237393 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.237577 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.237699 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.237825 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.340571 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.340921 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.341102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.341206 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.341277 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.443738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.444122 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.444231 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.444328 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.444435 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.546753 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.546801 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.546813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.546829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.546839 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.649387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.649444 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.649463 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.649486 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.649501 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.752284 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.752347 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.752369 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.752396 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.752411 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.860064 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.861066 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.861125 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.861158 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.861177 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.871233 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.871269 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:27 crc kubenswrapper[4990]: E1003 09:44:27.871400 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:27 crc kubenswrapper[4990]: E1003 09:44:27.871899 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.964862 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.965328 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.965643 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.965835 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:27 crc kubenswrapper[4990]: I1003 09:44:27.965995 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:27Z","lastTransitionTime":"2025-10-03T09:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.069579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.069663 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.069691 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.069720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.069741 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.173189 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.173282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.173306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.173341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.173364 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.276387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.276436 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.276453 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.276477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.276496 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.379246 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.379395 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.379425 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.379494 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.379576 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.482208 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.482253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.482268 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.482289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.482308 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.585223 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.585277 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.585292 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.585313 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.585327 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.687770 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.687836 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.687853 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.687871 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.687883 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.790260 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.790323 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.790334 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.790349 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.790358 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.871114 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.871158 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:28 crc kubenswrapper[4990]: E1003 09:44:28.871272 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:28 crc kubenswrapper[4990]: E1003 09:44:28.871453 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.893141 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.893191 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.893201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.893218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.893233 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.906267 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef7800b801735c23bbb2dcbeec9f3e455556a791c7f25c11f11e3833771a3e8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"message\\\":\\\"or network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.245\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9192, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1003 09:44:11.208067 6419 services_controller.go:360] Finished syncing service metrics on namespace openshift-controller-manager-operator for network=default : 2.796173ms\\\\nI1003 09:44:11.208645 6419 services_controller.go:444] Built service openshift-machine-api/cluster-autoscaler-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1003 09:44:11.208658 6419 services_controller.go:445] Built service openshift-machine-api/cluster-autoscaler-operator LB template configs for network=default: []services.lbConfig(nil)\\\\nF1003 09:44:11.208663 6419 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.920705 4990 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc 
kubenswrapper[4990]: I1003 09:44:28.935215 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.954184 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.972684 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.991875 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:28Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.995899 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.995935 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.995949 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.995971 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:28 crc kubenswrapper[4990]: I1003 09:44:28.995989 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:28Z","lastTransitionTime":"2025-10-03T09:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.007152 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z 
is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.023759 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.026489 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:29 crc kubenswrapper[4990]: E1003 09:44:29.026694 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:29 crc kubenswrapper[4990]: E1003 09:44:29.026793 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:44:45.026768008 +0000 UTC m=+66.823399865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.039473 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.057365 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d
99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.073618 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.088829 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.098495 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.098791 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.098810 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.098830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.098843 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.101463 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.116803 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3
461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.132947 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.147452 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:29Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.201623 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.201689 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.201729 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.201766 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.201781 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.304579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.304634 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.304649 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.304670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.304683 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.407565 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.407610 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.407623 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.407640 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.407652 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.510389 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.510449 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.510460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.510479 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.510491 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.613173 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.613225 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.613236 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.613281 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.613291 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.716369 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.716445 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.716456 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.716476 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.716489 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.819503 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.819584 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.819598 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.819616 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.819629 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.871874 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.871985 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:29 crc kubenswrapper[4990]: E1003 09:44:29.872136 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:29 crc kubenswrapper[4990]: E1003 09:44:29.872253 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.922951 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.923784 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.923841 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.923881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:29 crc kubenswrapper[4990]: I1003 09:44:29.923896 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:29Z","lastTransitionTime":"2025-10-03T09:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.027126 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.027169 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.027179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.027194 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.027221 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.129925 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.129978 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.129989 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.130006 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.130017 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.233952 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.234036 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.234059 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.234093 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.234114 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.337269 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.337349 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.337373 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.337405 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.337432 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.440701 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.440785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.440823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.440860 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.440883 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.543627 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.543668 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.543680 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.543696 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.543706 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.647009 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.647075 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.647098 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.647129 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.647155 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.750862 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.750906 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.750917 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.750936 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.750949 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.853133 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.853169 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.853179 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.853195 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.853205 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.871685 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.871723 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:30 crc kubenswrapper[4990]: E1003 09:44:30.871822 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:30 crc kubenswrapper[4990]: E1003 09:44:30.871885 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.957740 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.957785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.957793 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.957810 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:30 crc kubenswrapper[4990]: I1003 09:44:30.957820 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:30Z","lastTransitionTime":"2025-10-03T09:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.061752 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.061805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.061818 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.061850 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.061864 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.163738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.163779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.163791 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.163810 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.163822 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.265894 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.265934 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.265942 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.265957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.265966 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.370157 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.370230 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.370259 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.370294 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.370319 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.472197 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.472253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.472266 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.472286 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.472296 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.574808 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.574872 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.574889 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.574915 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.574933 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.677538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.677579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.677586 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.677601 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.677610 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.757278 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.757622 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 09:45:03.757585761 +0000 UTC m=+85.554217618 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.757732 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.757826 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:45:03.757802546 +0000 UTC m=+85.554434433 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.757494 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.758127 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.758271 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.758302 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.758322 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.758376 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:45:03.758361741 +0000 UTC m=+85.554993628 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.758882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759015 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759046 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759062 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759127 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:45:03.75911178 +0000 UTC m=+85.555743667 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.759197 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759290 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.759347 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-03 09:45:03.759333806 +0000 UTC m=+85.555965693 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.781163 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.781240 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.781253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.781278 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.781293 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.870946 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.871073 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.871122 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:31 crc kubenswrapper[4990]: E1003 09:44:31.871258 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.884661 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.884725 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.884746 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.884780 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.884807 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.988113 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.988182 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.988199 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.988226 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:31 crc kubenswrapper[4990]: I1003 09:44:31.988244 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:31Z","lastTransitionTime":"2025-10-03T09:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.090852 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.090897 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.090908 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.090925 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.090937 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.194783 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.194856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.194881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.194912 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.194937 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.298507 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.298603 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.298618 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.298651 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.298672 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.402490 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.402590 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.402614 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.402645 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.402666 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.505579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.505639 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.505649 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.505667 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.505681 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.609197 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.609245 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.609258 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.609277 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.609289 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.712725 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.712788 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.712810 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.712836 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.712854 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.816081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.816120 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.816131 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.816149 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.816159 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.871243 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.871277 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:32 crc kubenswrapper[4990]: E1003 09:44:32.871406 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:32 crc kubenswrapper[4990]: E1003 09:44:32.871569 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.919284 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.919351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.919362 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.919379 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:32 crc kubenswrapper[4990]: I1003 09:44:32.919391 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:32Z","lastTransitionTime":"2025-10-03T09:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.023629 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.023698 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.023733 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.023773 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.023794 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.126966 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.127015 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.127027 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.127045 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.127057 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.230008 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.230069 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.230081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.230101 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.230113 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.332833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.332882 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.332896 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.332916 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.332928 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.435768 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.435821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.435840 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.435865 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.435882 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.539465 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.539616 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.539637 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.539665 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.539683 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.641947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.641981 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.641989 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.642004 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.642016 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.744362 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.744800 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.744893 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.744990 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.745079 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.849082 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.849139 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.849190 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.849213 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.849811 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.870844 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.870893 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:33 crc kubenswrapper[4990]: E1003 09:44:33.870978 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:33 crc kubenswrapper[4990]: E1003 09:44:33.871125 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.954017 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.954099 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.954118 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.954144 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:33 crc kubenswrapper[4990]: I1003 09:44:33.954168 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:33Z","lastTransitionTime":"2025-10-03T09:44:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.057472 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.057542 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.057555 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.057575 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.057587 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.159523 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.159573 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.159812 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.159830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.159841 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.262690 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.262738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.262750 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.262774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.262783 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.365681 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.365744 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.365764 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.365789 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.365809 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.468815 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.468872 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.468884 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.468903 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.468918 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.572431 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.572477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.572486 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.572501 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.572543 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.675295 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.675351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.675363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.675384 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.675400 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.777865 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.777918 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.777933 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.777952 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.777968 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.871443 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.871566 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:34 crc kubenswrapper[4990]: E1003 09:44:34.871614 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:34 crc kubenswrapper[4990]: E1003 09:44:34.871712 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.880373 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.880432 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.880452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.880477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.880494 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.983163 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.983220 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.983233 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.983251 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:34 crc kubenswrapper[4990]: I1003 09:44:34.983264 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:34Z","lastTransitionTime":"2025-10-03T09:44:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.086185 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.086250 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.086276 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.086307 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.086328 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.188973 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.189021 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.189032 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.189049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.189060 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.291001 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.291057 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.291066 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.291082 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.291094 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.393875 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.393967 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.394002 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.394022 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.394045 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.411482 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.417772 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.417823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.417835 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.417849 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.417860 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.441848 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.446015 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.446057 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.446071 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.446094 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.446107 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.462027 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.466821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.466868 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.466882 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.466904 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.466917 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.480765 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.483128 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.483953 4990 scope.go:117] "RemoveContainer" containerID="f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615" Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.484158 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.485302 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.485352 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.485364 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.485381 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.485394 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.486977 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.496676 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.501173 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.501328 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.502119 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.503242 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.503295 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.503310 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.503329 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.503343 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.516342 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f906
13d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.529876 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.544203 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.554358 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.564733 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.575254 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.591371 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606367 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606415 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606428 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606446 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606460 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.606964 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.617089 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.638115 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.648927 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.661523 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.673703 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.686013 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.700448 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.709740 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.709803 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.709813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.709837 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.709874 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.715073 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f906
13d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.730374 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.744587 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.756617 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.768666 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.783331 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.806376 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.813102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.813154 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.813167 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.813187 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.813200 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.824371 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.839185 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.864890 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.871616 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.871662 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.871731 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:35 crc kubenswrapper[4990]: E1003 09:44:35.871809 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.883348 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc 
kubenswrapper[4990]: I1003 09:44:35.901991 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.916377 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.916422 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.916434 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.916452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.916464 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:35Z","lastTransitionTime":"2025-10-03T09:44:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.920017 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.935694 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.949877 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.961645 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:35 crc kubenswrapper[4990]: I1003 09:44:35.973458 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\
",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:35Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.018669 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.018726 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.018737 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.018756 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.018769 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.121360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.121401 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.121417 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.121440 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.121456 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.224011 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.224061 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.224076 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.224095 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.224109 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.327147 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.327202 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.327213 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.327238 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.327257 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.431283 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.431340 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.431374 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.431398 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.431410 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.535083 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.535142 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.535159 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.535185 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.535205 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.637774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.637838 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.637846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.637864 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.637874 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.740598 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.740652 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.740668 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.740687 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.740701 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.843643 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.843687 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.843695 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.843711 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.843721 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.871619 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.871693 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:36 crc kubenswrapper[4990]: E1003 09:44:36.871781 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:36 crc kubenswrapper[4990]: E1003 09:44:36.871925 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.947000 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.947049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.947061 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.947079 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:36 crc kubenswrapper[4990]: I1003 09:44:36.947092 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:36Z","lastTransitionTime":"2025-10-03T09:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.050121 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.050613 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.050626 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.050646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.050658 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.154135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.154216 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.154238 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.154269 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.154368 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.257139 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.257213 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.257237 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.257269 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.257294 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.360881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.360924 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.360933 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.360947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.360956 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.464934 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.464992 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.465004 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.465025 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.465039 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.568412 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.568485 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.568498 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.568551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.568565 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.671029 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.671075 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.671085 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.671100 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.671111 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.774609 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.774649 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.774660 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.774677 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.774688 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.870758 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.870859 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:37 crc kubenswrapper[4990]: E1003 09:44:37.870911 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:37 crc kubenswrapper[4990]: E1003 09:44:37.871055 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.877969 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.878015 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.878026 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.878046 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.878058 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.980775 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.980934 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.980964 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.980995 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:37 crc kubenswrapper[4990]: I1003 09:44:37.981019 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:37Z","lastTransitionTime":"2025-10-03T09:44:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.083681 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.083729 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.083738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.083756 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.083771 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.186951 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.187003 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.187018 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.187042 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.187057 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.289805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.289877 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.289893 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.289919 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.289941 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.392890 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.392929 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.392940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.392957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.392970 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.495805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.495850 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.495861 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.495876 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.495885 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.518748 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.538735 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.555716 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.567664 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.586034 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.598915 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.599217 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.599345 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.599449 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.599606 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.602781 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc 
kubenswrapper[4990]: I1003 09:44:38.614872 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.633086 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.645020 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.663595 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.677434 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.696096 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:0
2Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.701716 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.702090 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.702253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.702400 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.702527 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.710500 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.724443 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.736745 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.746937 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.762315 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.773314 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.805739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.805791 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.805802 4990 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.805821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.805835 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.870904 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.870902 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:38 crc kubenswrapper[4990]: E1003 09:44:38.871037 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:38 crc kubenswrapper[4990]: E1003 09:44:38.871078 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.884971 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.896815 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.908540 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.908577 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.908589 4990 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.908605 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.908616 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:38Z","lastTransitionTime":"2025-10-03T09:44:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.909146 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.922955 4990 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.938637 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.951085 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.964647 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.977655 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:38 crc kubenswrapper[4990]: I1003 09:44:38.990552 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:38Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.009766 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.011458 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.011521 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.011539 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.011564 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.011601 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.024307 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc 
kubenswrapper[4990]: I1003 09:44:39.038230 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.052446 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.065899 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.080267 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.094410 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.111990 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:0
2Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:39Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.114047 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.114187 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.114363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.114606 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.114813 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.217785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.217834 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.217846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.217864 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.217876 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.320528 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.320586 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.320599 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.320620 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.320634 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.423893 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.423936 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.423947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.423963 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.423974 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.526527 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.526580 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.526591 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.526611 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.526623 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.629164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.629227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.629240 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.629262 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.629275 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.732307 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.732632 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.732736 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.732843 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.732941 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.836029 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.836329 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.836407 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.836563 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.836662 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.871219 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.871284 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:39 crc kubenswrapper[4990]: E1003 09:44:39.871378 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:39 crc kubenswrapper[4990]: E1003 09:44:39.871484 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.939674 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.940399 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.940536 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.940663 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:39 crc kubenswrapper[4990]: I1003 09:44:39.940792 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:39Z","lastTransitionTime":"2025-10-03T09:44:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.043858 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.043910 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.043920 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.043937 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.043950 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.146691 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.146739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.146749 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.146770 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.146782 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.249186 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.249233 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.249243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.249258 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.249271 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.352125 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.352173 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.352183 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.352200 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.352216 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.455317 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.455360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.455370 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.455387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.455401 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.558403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.558459 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.558473 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.558492 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.558520 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.668080 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.668454 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.668590 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.668666 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.668727 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.771603 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.772126 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.772238 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.772360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.772481 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.871110 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.871182 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:40 crc kubenswrapper[4990]: E1003 09:44:40.871262 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:40 crc kubenswrapper[4990]: E1003 09:44:40.871323 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.875600 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.875629 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.875640 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.875655 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.875667 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.978404 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.978438 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.978448 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.978461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:40 crc kubenswrapper[4990]: I1003 09:44:40.978470 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:40Z","lastTransitionTime":"2025-10-03T09:44:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.081145 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.081190 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.081204 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.081221 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.081233 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.183181 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.183224 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.183237 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.183254 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.183265 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.286339 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.286410 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.286435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.286466 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.286487 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.389207 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.389272 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.389282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.389297 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.389307 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.492257 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.492305 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.492314 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.492333 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.492343 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.595375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.595444 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.595456 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.595495 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.595520 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.698755 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.698797 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.698808 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.698824 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.698834 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.801407 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.801468 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.801482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.801500 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.801557 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.871439 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.871492 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:41 crc kubenswrapper[4990]: E1003 09:44:41.871693 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:41 crc kubenswrapper[4990]: E1003 09:44:41.871822 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.904081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.904121 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.904130 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.904144 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:41 crc kubenswrapper[4990]: I1003 09:44:41.904154 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:41Z","lastTransitionTime":"2025-10-03T09:44:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.007301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.007368 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.007381 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.007408 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.007423 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.110192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.110235 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.110243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.110261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.110274 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.213053 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.213121 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.213138 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.213187 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.213204 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.316588 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.316637 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.316648 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.316668 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.316677 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.419323 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.419388 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.419403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.419429 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.419454 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.521913 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.521957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.521968 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.521984 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.521996 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.624959 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.625031 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.625044 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.625062 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.625074 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.728103 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.728171 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.728195 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.728226 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.728249 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.830879 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.830957 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.830983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.831013 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.831031 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.871707 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.871772 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:42 crc kubenswrapper[4990]: E1003 09:44:42.871869 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:42 crc kubenswrapper[4990]: E1003 09:44:42.871971 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.934242 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.934288 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.934304 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.934365 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:42 crc kubenswrapper[4990]: I1003 09:44:42.934391 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:42Z","lastTransitionTime":"2025-10-03T09:44:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.037201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.037306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.037319 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.037338 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.037350 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.139688 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.139736 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.139751 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.139771 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.139785 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.242192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.242237 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.242253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.242271 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.242283 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.344899 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.344953 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.344967 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.344983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.344993 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.447896 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.447935 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.447947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.447963 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.447974 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.550065 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.550090 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.550098 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.550110 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.550118 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.653140 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.653201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.653214 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.653477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.653504 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.757115 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.757164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.757177 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.757195 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.757207 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.860144 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.860207 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.860219 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.860236 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.860249 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.871622 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.871718 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:43 crc kubenswrapper[4990]: E1003 09:44:43.871796 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:43 crc kubenswrapper[4990]: E1003 09:44:43.871962 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.962253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.962308 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.962321 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.962340 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:43 crc kubenswrapper[4990]: I1003 09:44:43.962353 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:43Z","lastTransitionTime":"2025-10-03T09:44:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.065833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.065888 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.065904 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.065926 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.065941 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.169252 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.169331 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.169344 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.169366 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.169382 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.272753 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.272802 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.272813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.272830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.272842 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.375091 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.375139 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.375149 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.375165 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.375177 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.477348 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.477390 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.477402 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.477418 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.477431 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.580040 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.580103 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.580120 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.580145 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.580162 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.682732 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.682788 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.682801 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.682823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.682836 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.785173 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.785212 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.785227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.785245 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.785259 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.871373 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.871494 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:44 crc kubenswrapper[4990]: E1003 09:44:44.871674 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:44 crc kubenswrapper[4990]: E1003 09:44:44.871840 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.887642 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.887687 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.887700 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.887717 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.887729 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.990502 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.990564 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.990579 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.990596 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:44 crc kubenswrapper[4990]: I1003 09:44:44.990611 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:44Z","lastTransitionTime":"2025-10-03T09:44:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.093351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.093404 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.093414 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.093429 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.093440 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.108264 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.108808 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.108928 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:45:17.108892917 +0000 UTC m=+98.905524774 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.195757 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.195809 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.195818 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.195833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.195843 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.299029 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.299102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.299132 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.299152 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.299167 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.401054 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.401096 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.401107 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.401123 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.401133 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.504402 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.504465 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.504482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.504506 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.504555 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.607218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.607261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.607272 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.607289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.607303 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.710200 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.710240 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.710251 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.710267 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.710278 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.711311 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.711341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.711351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.711362 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.711371 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.724488 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:45Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.727940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.727971 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.727980 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.727995 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.728005 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.739855 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:45Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.745049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.745092 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.745127 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.745147 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.745158 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.758863 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:45Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.763309 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.763403 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.763423 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.763444 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.763456 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.777055 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:45Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.782095 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.782159 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.782171 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.782191 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.782206 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.796356 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:45Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.796470 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.812476 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.812536 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.812553 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.812573 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.812586 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.871564 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.871671 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.871842 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:45 crc kubenswrapper[4990]: E1003 09:44:45.871885 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.915063 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.915124 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.915135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.915157 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:45 crc kubenswrapper[4990]: I1003 09:44:45.915170 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:45Z","lastTransitionTime":"2025-10-03T09:44:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.017685 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.017722 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.017732 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.017750 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.017763 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.120023 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.120097 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.120114 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.120142 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.120158 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.223181 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.223421 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.223431 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.223451 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.223462 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.325176 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.325219 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.325230 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.325246 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.325258 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.427621 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.427667 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.427681 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.427698 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.427709 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.530107 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.530158 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.530169 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.530185 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.530196 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.633089 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.633150 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.633164 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.633184 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.633204 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.735809 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.735855 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.735868 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.735884 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.735893 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.839142 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.839213 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.839227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.839247 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.839261 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.871804 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.871819 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:46 crc kubenswrapper[4990]: E1003 09:44:46.871993 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:46 crc kubenswrapper[4990]: E1003 09:44:46.872101 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.941649 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.941684 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.941693 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.941707 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:46 crc kubenswrapper[4990]: I1003 09:44:46.941717 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:46Z","lastTransitionTime":"2025-10-03T09:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.043986 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.044028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.044043 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.044065 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.044078 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.146752 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.146802 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.146814 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.146830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.146842 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.249300 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.249341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.249351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.249369 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.249378 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.316586 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/0.log" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.316644 4990 generic.go:334] "Generic (PLEG): container finished" podID="31671a76-378e-4899-89ae-d27e608c3cda" containerID="8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132" exitCode=1 Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.316693 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerDied","Data":"8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.317253 4990 scope.go:117] "RemoveContainer" containerID="8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.340163 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.353466 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.353666 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.353683 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.353700 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.353711 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.354823 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.365896 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.383574 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.394888 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.408187 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec1
0e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.422712 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.436071 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.449478 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.456739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.456794 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.456809 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.456830 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.456846 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.463494 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.483319 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.496035 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.517921 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.531658 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.545421 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.555950 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.559604 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.559719 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.559794 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.559866 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.559928 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.565934 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:47Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.662857 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.662886 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.662897 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.662912 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.662922 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.764775 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.764811 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.764824 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.764840 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.764852 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.867542 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.867853 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.867966 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.868069 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.868152 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.870829 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:47 crc kubenswrapper[4990]: E1003 09:44:47.870928 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.870829 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:47 crc kubenswrapper[4990]: E1003 09:44:47.871248 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.971329 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.971361 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.971371 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.971385 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:47 crc kubenswrapper[4990]: I1003 09:44:47.971395 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:47Z","lastTransitionTime":"2025-10-03T09:44:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.075208 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.075259 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.075271 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.075291 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.075310 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.178437 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.178477 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.178485 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.178500 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.178529 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.281280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.281325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.281334 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.281349 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.281361 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.321809 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/0.log" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.321876 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerStarted","Data":"906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.336467 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.347396 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.364855 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.374637 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.384040 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.384080 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.384093 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.384112 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.384127 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.386335 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.400252 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.410859 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.424828 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.437549 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multu
s\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.448321 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.461823 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.474460 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.486653 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.487366 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.487398 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.487407 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.487422 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.487432 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.501068 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.511982 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.520967 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.531275 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.590426 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.590460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.590469 4990 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.590482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.590491 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.693391 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.693447 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.693458 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.693475 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.693484 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.796323 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.796371 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.796382 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.796400 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.796412 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.870742 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:48 crc kubenswrapper[4990]: E1003 09:44:48.870880 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.871429 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:48 crc kubenswrapper[4990]: E1003 09:44:48.871499 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.871933 4990 scope.go:117] "RemoveContainer" containerID="f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.889312 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.899227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.899257 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.899266 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.899280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.899290 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:48Z","lastTransitionTime":"2025-10-03T09:44:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.903755 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.918195 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://898
0e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.932843 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.953183 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.975674 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:48 crc kubenswrapper[4990]: I1003 09:44:48.990075 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:48Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001695 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001733 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001746 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001761 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001773 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.001925 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.030126 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.039959 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.050971 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.064056 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.076733 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.094453 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.105375 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.106457 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.106478 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.106487 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.106503 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.106532 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.122270 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.134288 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.209196 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.209244 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.209256 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.209273 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.209287 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.312016 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.312055 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.312064 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.312089 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.312100 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.326678 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/2.log" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.352766 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.353571 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.382961 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab851
41fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.400741 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.414049 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.415225 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.415287 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.415302 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.415322 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.415340 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.436786 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.452348 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3
461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.466821 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10f
dee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.482213 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.507651 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\
\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.517692 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.517729 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.517739 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.517755 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.517766 4990 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.527274 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc 
kubenswrapper[4990]: I1003 09:44:49.555521 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.574820 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.587334 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.604116 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.619284 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.620427 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.620449 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.620458 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.620489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.620499 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.634200 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.648734 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.662131 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.722671 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.722724 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.722753 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.722774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.722787 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.825452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.825854 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.825947 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.826043 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.826130 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.871721 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.871737 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:49 crc kubenswrapper[4990]: E1003 09:44:49.872140 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:49 crc kubenswrapper[4990]: E1003 09:44:49.872253 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.928881 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.928921 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.928931 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.928946 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:49 crc kubenswrapper[4990]: I1003 09:44:49.928956 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:49Z","lastTransitionTime":"2025-10-03T09:44:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.031379 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.031412 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.031420 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.031433 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.031442 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.134298 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.134342 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.134354 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.134375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.134389 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.237081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.238123 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.238229 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.238318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.238404 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.340726 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.340769 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.340779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.340795 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.340805 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.356876 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/3.log" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.357627 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/2.log" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.359997 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" exitCode=1 Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.360032 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.360104 4990 scope.go:117] "RemoveContainer" containerID="f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.361097 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:44:50 crc kubenswrapper[4990]: E1003 09:44:50.361326 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.384178 4990 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f997d6d077ce9b2b2f2935640d15036898a28d227d7c19ec0607982d4ecba615\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:25Z\\\",\\\"message\\\":\\\"8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 09:44:24.999891 6659 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 09:44:24.999955 6659 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:24Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:24.999925 6659 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/pack\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:49Z\\\",\\\"message\\\":\\\"e-controller-manager-crc\\\\nI1003 09:44:49.836611 6998 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 09:44:49.836618 6998 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 09:44:49.836615 6998 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-lrqf5\\\\nF1003 09:44:49.836628 6998 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:49.836634 6998 obj_retry.go:303] Retry object setup: *v1.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\
"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.396568 4990 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc 
kubenswrapper[4990]: I1003 09:44:50.409285 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.421210 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.430468 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.443233 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.443395 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.443489 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc 
kubenswrapper[4990]: I1003 09:44:50.443595 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.443699 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.443885 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.459347 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] 
Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.470883 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.487116 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:
59Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9
dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.503268 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.519960 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.535886 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.546504 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.546549 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.546558 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.546572 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.546584 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.550336 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.561167 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.574626 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.588162 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:4
3:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6
f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.600071 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:50Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.649155 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.649192 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.649201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.649218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.649228 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.752424 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.752492 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.752508 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.752582 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.752600 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.855243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.855279 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.855289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.855304 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.855313 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.871597 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:50 crc kubenswrapper[4990]: E1003 09:44:50.871749 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.871766 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:50 crc kubenswrapper[4990]: E1003 09:44:50.871827 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.957790 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.957833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.957842 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.957856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:50 crc kubenswrapper[4990]: I1003 09:44:50.957865 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:50Z","lastTransitionTime":"2025-10-03T09:44:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.059781 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.059828 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.059837 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.059853 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.059862 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.162223 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.162282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.162295 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.162318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.162331 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.266247 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.266327 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.266343 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.266374 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.266389 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.365555 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/3.log" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.368720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.368801 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.368827 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.368859 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.368881 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.370345 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:44:51 crc kubenswrapper[4990]: E1003 09:44:51.370549 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.386661 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: 
I1003 09:44:51.402382 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.417250 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.452671 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:49Z\\\",\\\"message\\\":\\\"e-controller-manager-crc\\\\nI1003 09:44:49.836611 6998 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 09:44:49.836618 6998 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 
09:44:49.836615 6998 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-lrqf5\\\\nF1003 09:44:49.836628 6998 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:49.836634 6998 obj_retry.go:303] Retry object setup: *v1.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.467747 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.472245 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.472306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.472330 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.472353 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.472372 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.482548 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a49
8f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.504674 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.518381 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.535083 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.549896 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multu
s\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.565807 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.574317 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.574383 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.574397 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.574413 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.574423 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.579458 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f906
13d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.591530 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.606602 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.620957 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.634666 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.649431 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:51Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.676785 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.676829 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.676842 4990 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.676867 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.676883 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.779505 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.779610 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.779624 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.779642 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.779933 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.871766 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.871831 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:51 crc kubenswrapper[4990]: E1003 09:44:51.871912 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:51 crc kubenswrapper[4990]: E1003 09:44:51.872013 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.883452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.883540 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.883561 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.883584 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.883598 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.986311 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.986381 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.986400 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.986423 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:51 crc kubenswrapper[4990]: I1003 09:44:51.986438 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:51Z","lastTransitionTime":"2025-10-03T09:44:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.089372 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.089430 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.089444 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.089465 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.089479 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.192588 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.192655 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.192668 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.192688 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.192699 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.295399 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.295447 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.295461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.295478 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.295488 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.397406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.397463 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.397475 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.397495 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.397510 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.501086 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.501154 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.501166 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.501184 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.501196 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.606406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.606446 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.606454 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.606468 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.606477 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.708765 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.708835 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.708853 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.708875 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.708894 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.812218 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.812266 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.812285 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.812303 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.812318 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.871488 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.871982 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:52 crc kubenswrapper[4990]: E1003 09:44:52.872105 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:52 crc kubenswrapper[4990]: E1003 09:44:52.875469 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.915666 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.915709 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.915720 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.915735 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:52 crc kubenswrapper[4990]: I1003 09:44:52.915745 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:52Z","lastTransitionTime":"2025-10-03T09:44:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.018734 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.018790 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.018800 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.018815 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.018825 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.122072 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.122136 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.122147 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.122165 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.122203 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.224942 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.224981 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.224989 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.225002 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.225012 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.327368 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.327430 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.327443 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.327464 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.327476 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.430852 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.430906 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.430921 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.430943 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.430957 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.534562 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.534628 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.534648 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.534674 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.534692 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.637646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.637705 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.637717 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.637741 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.637754 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.740591 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.740664 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.740688 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.740711 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.740727 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.843086 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.843170 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.843188 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.843210 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.843228 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.870900 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.870968 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:53 crc kubenswrapper[4990]: E1003 09:44:53.871036 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:53 crc kubenswrapper[4990]: E1003 09:44:53.871182 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.947281 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.947349 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.947366 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.947392 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:53 crc kubenswrapper[4990]: I1003 09:44:53.947409 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:53Z","lastTransitionTime":"2025-10-03T09:44:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.050191 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.050235 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.050243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.050261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.050269 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.152894 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.152950 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.152964 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.152983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.152998 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.256380 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.256439 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.256452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.256545 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.256558 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.359806 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.359848 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.359860 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.359876 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.359891 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.463250 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.463311 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.463322 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.463341 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.463354 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.565713 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.565779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.565789 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.565805 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.565814 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.669212 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.669267 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.669279 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.669303 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.669322 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.771617 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.771657 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.771670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.771686 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.771699 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.871840 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.871890 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:54 crc kubenswrapper[4990]: E1003 09:44:54.872086 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:54 crc kubenswrapper[4990]: E1003 09:44:54.872272 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.874386 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.874428 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.874439 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.874457 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.874469 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.976886 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.976933 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.976942 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.976959 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:54 crc kubenswrapper[4990]: I1003 09:44:54.976968 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:54Z","lastTransitionTime":"2025-10-03T09:44:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.080116 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.080169 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.080180 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.080197 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.080209 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.183455 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.183552 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.183590 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.183655 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.183675 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.287444 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.287507 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.287576 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.287609 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.287633 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.390658 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.390718 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.390741 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.390770 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.390794 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.493033 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.493142 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.493156 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.493178 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.493193 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.597372 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.597440 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.597460 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.597485 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.597502 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.699458 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.699540 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.699572 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.699615 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.699639 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.802481 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.802508 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.802518 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.802547 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.802556 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.870768 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:55 crc kubenswrapper[4990]: E1003 09:44:55.870915 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.870768 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:55 crc kubenswrapper[4990]: E1003 09:44:55.871015 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.905225 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.905253 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.905261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.905275 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:55 crc kubenswrapper[4990]: I1003 09:44:55.905284 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:55Z","lastTransitionTime":"2025-10-03T09:44:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.007665 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.007700 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.007716 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.007732 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.007743 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.114475 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.114577 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.114590 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.114608 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.114623 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.116484 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.116778 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.116797 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.116821 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.116838 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.133101 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:56Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.137953 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.137982 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.137993 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.138008 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.138019 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.151953 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:56Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.155603 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.155636 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.155647 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.155663 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.155672 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.167467 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:56Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.171021 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.171049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.171059 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.171075 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.171087 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.182435 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:56Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.185682 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.185709 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.185717 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.185731 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.185739 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.197252 4990 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8954c5f5-a70f-4fb3-9378-33cf06a3d6b1\\\",\\\"systemUUID\\\":\\\"1dbe54b5-0a5d-46a2-9c08-21093914202d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:56Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.197358 4990 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.217332 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.217373 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.217409 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.217429 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.217442 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.320616 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.320652 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.320662 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.320682 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.320695 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.422924 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.422983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.423000 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.423026 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.423042 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.525762 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.525816 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.525827 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.525848 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.525860 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.628633 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.628693 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.628703 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.628723 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.628735 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.731892 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.731981 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.732014 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.732049 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.732073 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.836551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.836624 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.836642 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.836674 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.836701 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.871104 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.871190 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.871323 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:56 crc kubenswrapper[4990]: E1003 09:44:56.871637 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.940392 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.940478 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.940497 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.940551 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:56 crc kubenswrapper[4990]: I1003 09:44:56.940578 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:56Z","lastTransitionTime":"2025-10-03T09:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.044009 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.044079 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.044102 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.044135 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.044160 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.146765 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.146823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.146834 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.146853 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.146865 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.249715 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.249768 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.249780 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.249801 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.249814 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.353113 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.353154 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.353163 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.353180 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.353189 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.456243 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.456307 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.456325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.456349 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.456368 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.558901 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.558951 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.558968 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.558989 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.559005 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.661593 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.661631 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.661639 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.661657 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.661666 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.764064 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.764103 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.764112 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.764128 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.764138 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.866749 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.866780 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.866791 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.866807 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.866815 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.871368 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.871409 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:57 crc kubenswrapper[4990]: E1003 09:44:57.871468 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:57 crc kubenswrapper[4990]: E1003 09:44:57.871584 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.969940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.969988 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.970001 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.970023 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:57 crc kubenswrapper[4990]: I1003 09:44:57.970034 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:57Z","lastTransitionTime":"2025-10-03T09:44:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.073165 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.073246 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.073270 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.073299 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.073323 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.176486 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.176581 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.176596 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.176622 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.176636 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.280171 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.280259 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.280282 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.280318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.280341 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.383240 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.383279 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.383288 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.383305 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.383315 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.486870 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.486917 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.486930 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.486949 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.486962 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.589940 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.590009 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.590030 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.590054 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.590072 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.693013 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.693089 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.693105 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.693128 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.693167 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.796226 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.796283 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.796296 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.796317 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.796331 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.871663 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.871783 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:44:58 crc kubenswrapper[4990]: E1003 09:44:58.871897 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:44:58 crc kubenswrapper[4990]: E1003 09:44:58.872106 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.883179 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.887248 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecb5c3e6-1d7e-43d1-8256-971ab553b87c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27063fb2071619a3acf8503729b830abc2535afdf0606f3f82bed20973ff51ae\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://139c771f90613d053f3c3a4603e0086271a2d1d001bfa0a63c024537e5e52423\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73d483af094d9230088e543f05f65b594a04cf9667186f2098fccc7ee9052c90\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3067e282f299f1a1b3f85e950af4cf91fa93cd954cec9f7f5d73c92b2397b6f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.899246 4990 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.899289 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.899301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.899318 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.899331 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:58Z","lastTransitionTime":"2025-10-03T09:44:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.901879 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.918352 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.929307 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lrqf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f59e4b-517b-444e-8df2-5b8dae4d5d67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08928dbc2fc590948e15c7425b85a4654fee2aacf93fc62f9c40d41ca2afcb82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nchtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lrqf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.939496 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-j96ms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56ed72a9-69d0-4f5e-b38b-f91c1221c917\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dc701c01fa34eceb8beedfa3f35bc70e5697c14df5763d5e379e1cd5a6386cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swlfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:01Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-j96ms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.950955 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a29e92f6-66b6-445b-b7c2-a708c69f6c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://955a2cdb3461b243e4afa1da83a4364d019ba228fe0dd9dc35e1e8fda284231a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5367a9cd2c2476ed8cc723cef1a016bd4280a
566cbc3f977ab9d27c6d6e86485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2x8zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-l8sx4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.963585 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:58 crc kubenswrapper[4990]: I1003 09:44:58.979036 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f21ea38c-26da-4987-a50d-bafecdfbbd02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a142326a739f0798a3dd05566e75a0c45c180563596cc3dd50df804e883f3da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://859805407f0015f647a9abeff75fc8bf25870c44
ec65e6150451a229fd09bf75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bmw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-68v62\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.000360 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:49Z\\\",\\\"message\\\":\\\"e-controller-manager-crc\\\\nI1003 09:44:49.836611 6998 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 09:44:49.836618 6998 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 
09:44:49.836615 6998 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-lrqf5\\\\nF1003 09:44:49.836628 6998 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:49Z is after 2025-08-24T17:21:41Z]\\\\nI1003 09:44:49.836634 6998 obj_retry.go:303] Retry object setup: *v1.\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9c85622dee61a69f
a43a92c18a8e9104fe2167856486e87ee1c47e877c956f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dr7pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7rqmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:58Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.001597 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.001629 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.001638 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.001652 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.001662 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.011381 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2a21582-ac04-4caa-a823-7c30c7f788c9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66shp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:44:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gdrcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc 
kubenswrapper[4990]: I1003 09:44:59.022586 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cfa0a1ee9e63c02e3571f3058d2dd107e9765cb5e188238dbecaa198e6dd852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a78f7479f4b747a7246dbcc48ce3fde4a9ef7542f6c24f6c4f16d03e2a16a15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.033858 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ed96813-18cb-4b58-aac1-14c13502747a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bf22ae01fd55b8544788a60512d1eb016c26962a1af6d76dff64b9fbfd24484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc26c08995e18e31cce7cc1908709f9db6187acb61ab452edbbe7187f8f870b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02eccbc71f61c9ce070f7454c59b11e3849068c272c8e7c2e276aacfb31ef229\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3202e7aae7698370c36b9177189820db508540909638aeb0ba426d9dbde00197\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e1d4bc4a94d3578fccbcfc58ee5f5d641197f5b47f6dac59f8f972efa53758\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T09:43:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1003 09:43:58.580372 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 09:43:58.580496 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 09:43:58.581179 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2144983803/tls.crt::/tmp/serving-cert-2144983803/tls.key\\\\\\\"\\\\nI1003 09:43:59.013867 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 09:43:59.021373 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 09:43:59.021406 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 09:43:59.021429 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 09:43:59.021436 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 09:43:59.027858 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 09:43:59.027886 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027892 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 09:43:59.027899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 09:43:59.027902 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 09:43:59.027905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 09:43:59.027908 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 09:43:59.029590 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1003 09:43:59.029741 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16075872890176a145e32bff0b8cd75ec290320583646eec9667f62192a9368\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c64ea39e4fed199324522a535c30a8b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19933b7d020e7710cff603b9dbeb6a4c
64ea39e4fed199324522a535c30a8b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.044120 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d910881b0d3e6ef227ffb8a29ea62ee15e50888ee193e8073485f769c5876d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.056282 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb0d603ac0f54ff675492a150207ad5a3e9c5399e998e4d1f6a3598d68f415d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.067317 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bspdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31671a76-378e-4899-89ae-d27e608c3cda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T09:44:47Z\\\",\\\"message\\\":\\\"2025-10-03T09:44:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e\\\\n2025-10-03T09:44:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_227fdec5-30fd-47a7-b4f4-5a0aeefeb46e to /host/opt/cni/bin/\\\\n2025-10-03T09:44:02Z [verbose] multus-daemon started\\\\n2025-10-03T09:44:02Z [verbose] Readiness Indicator file check\\\\n2025-10-03T09:44:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multu
s\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjq26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bspdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.077252 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a73bfa-3370-4519-b458-d5a1ea7ec2f7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:43:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16308965a7402c99bc4debdfa0d68ae23bdc6d9eb519d9bac6c5534a26266653\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e3d617a40ffb82ec7973a2c841a498f7a844cb2500e981a70cdef226235b60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6e40ff28d58ac07ff89d77410cd7b1ab5327f1400c9027917288c501d62236f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:43:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d53f4ec10e8c9b0e210e6bca17c9c660a8a8d98d1ddfd547e1f1fec934254e91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:43:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:43:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.089656 4990 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb69z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aac0cf74-c31a-4c75-8810-556b8e787c9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T09:44:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f3254b7acbab85141fcdbc0276cb5fe7ccff12de7e86e1d685151ec48f512b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T09:44:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e69d984a1c78becd471c76547092c012bdbc2ab66dd00a3e215e067dc0d0cf62\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05f84b79b45e73ac20fc654104b0769fc993ca9e59938d1ce5e6799be624f4d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a832548a83aafcc23788ccbe0c3ea7554d225f78bf3a4731cf18bfb193e91bbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7407b
bf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7407bbf557f2560b37f26116883b5471934213e556866ab34e4ed7d77be48d26\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f400ea828fb3c00fd20af2e12aeb4008bb40f8170563f6943339cc2b6a61af01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbba1fae1e8ce61c734674f10cdb154f02edeb11f2387658e1a484f65d99e1ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T09:44:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T09:44:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r6ppc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T09:43:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb69z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T09:44:59Z is after 2025-08-24T17:21:41Z" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.104694 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.104750 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.104759 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.104773 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.104798 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.207241 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.207295 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.207307 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.207325 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.207342 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.310007 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.310056 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.310069 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.310087 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.310099 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.413028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.413086 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.413100 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.413118 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.413131 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.516375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.516427 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.516438 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.516461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.516475 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.620045 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.620354 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.620469 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.620575 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.620665 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.722662 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.722714 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.722775 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.722797 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.722811 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.825965 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.826292 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.826390 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.826496 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.826811 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.871107 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:44:59 crc kubenswrapper[4990]: E1003 09:44:59.871283 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.871638 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:44:59 crc kubenswrapper[4990]: E1003 09:44:59.871860 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.930469 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.930530 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.930541 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.930559 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:44:59 crc kubenswrapper[4990]: I1003 09:44:59.930571 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:44:59Z","lastTransitionTime":"2025-10-03T09:44:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.033704 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.033762 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.033774 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.033794 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.033808 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.142450 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.142499 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.142544 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.142574 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.142588 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.247016 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.247071 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.247081 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.247101 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.247113 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.350363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.350402 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.350412 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.350431 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.350446 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.454018 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.454080 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.454092 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.454127 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.454138 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.558307 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.558375 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.558447 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.558476 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.558491 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.661797 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.661863 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.661876 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.661895 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.661908 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.766951 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.767009 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.767023 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.767045 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.767057 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870395 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870480 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870502 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870582 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870604 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.870892 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:00 crc kubenswrapper[4990]: E1003 09:45:00.871037 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.871064 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:00 crc kubenswrapper[4990]: E1003 09:45:00.871258 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.973790 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.973844 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.973854 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.973873 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:00 crc kubenswrapper[4990]: I1003 09:45:00.973885 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:00Z","lastTransitionTime":"2025-10-03T09:45:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.076297 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.076334 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.076345 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.076363 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.076374 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.179189 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.179225 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.179244 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.179261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.179273 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.281611 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.281659 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.281670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.281684 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.281694 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.385132 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.385174 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.385184 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.385201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.385214 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.487408 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.487457 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.487472 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.487491 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.487534 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.590870 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.590923 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.590934 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.590950 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.590961 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.694907 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.694983 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.695004 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.695035 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.695056 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.798707 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.798797 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.798819 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.798846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.798864 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.871478 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.871487 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:01 crc kubenswrapper[4990]: E1003 09:45:01.871648 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:01 crc kubenswrapper[4990]: E1003 09:45:01.871763 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.902278 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.902320 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.902331 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.902351 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:01 crc kubenswrapper[4990]: I1003 09:45:01.902362 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:01Z","lastTransitionTime":"2025-10-03T09:45:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.004602 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.004646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.004656 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.004669 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.004679 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.107861 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.107896 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.107905 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.107919 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.107928 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.210362 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.210426 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.210442 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.210465 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.210480 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.313572 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.313632 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.313646 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.313670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.313687 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.415987 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.416040 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.416052 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.416072 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.416088 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.519353 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.519401 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.519414 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.519436 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.519453 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.623172 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.623226 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.623237 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.623260 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.623273 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.726724 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.726813 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.726833 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.726856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.726874 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.829972 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.830020 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.830028 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.830043 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.830052 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.871639 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.871753 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:02 crc kubenswrapper[4990]: E1003 09:45:02.871894 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:02 crc kubenswrapper[4990]: E1003 09:45:02.872035 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.933598 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.933670 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.933694 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.933725 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:02 crc kubenswrapper[4990]: I1003 09:45:02.933746 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:02Z","lastTransitionTime":"2025-10-03T09:45:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.036744 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.036803 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.036823 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.036847 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.036864 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.140078 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.140145 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.140168 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.140200 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.140223 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.243747 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.243825 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.243850 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.243886 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.243909 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.346023 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.346094 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.346105 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.346120 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.346132 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.448415 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.448469 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.448482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.448502 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.448543 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.550734 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.550779 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.550788 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.550804 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.550814 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.653906 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.653959 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.653976 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.654006 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.654021 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.756968 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.757019 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.757036 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.757058 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.757073 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.838251 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838403 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 09:46:07.838372418 +0000 UTC m=+149.635004305 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.838474 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.838614 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.838674 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.838745 4990 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838768 4990 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838868 4990 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838927 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838942 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838976 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838991 4990 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.839005 4990 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.839017 4990 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.838875 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:46:07.838846191 +0000 UTC m=+149.635478098 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.839107 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 09:46:07.839082327 +0000 UTC m=+149.635714224 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.839152 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 09:46:07.839134228 +0000 UTC m=+149.635766125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.839189 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 09:46:07.839173289 +0000 UTC m=+149.635805186 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.860916 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.860980 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.861005 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.861036 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.861060 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.871421 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.871423 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.871678 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:03 crc kubenswrapper[4990]: E1003 09:45:03.871989 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.963503 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.963610 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.963628 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.963649 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:03 crc kubenswrapper[4990]: I1003 09:45:03.963663 4990 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:03Z","lastTransitionTime":"2025-10-03T09:45:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.066211 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.066284 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.066306 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.066326 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.066340 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.169227 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.169299 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.169308 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.169328 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.169343 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.272370 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.272406 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.272416 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.272435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.272446 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.375435 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.375491 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.375502 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.375533 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.375545 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.478452 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.478525 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.478538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.478555 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.478567 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.581586 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.581657 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.581667 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.581688 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.581704 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.684280 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.684330 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.684342 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.684360 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.684371 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.787261 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.787302 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.787313 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.787332 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.787345 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.870749 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:04 crc kubenswrapper[4990]: E1003 09:45:04.870922 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.871354 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:04 crc kubenswrapper[4990]: E1003 09:45:04.871708 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.890565 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.890614 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.890623 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.890640 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.890650 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.993599 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.993644 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.993656 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.993672 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:04 crc kubenswrapper[4990]: I1003 09:45:04.993682 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:04Z","lastTransitionTime":"2025-10-03T09:45:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.096650 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.096699 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.096712 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.096731 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.096743 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.202210 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.202284 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.202301 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.202330 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.202353 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.305686 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.305729 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.305738 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.305751 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.305761 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.408397 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.408433 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.408445 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.408461 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.408470 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.510846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.510886 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.510903 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.510922 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.510941 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.613788 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.613834 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.613846 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.613864 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.613878 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.716348 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.716387 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.716398 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.716415 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.716427 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.820054 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.820116 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.820133 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.820152 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.820163 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.870896 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.871218 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:05 crc kubenswrapper[4990]: E1003 09:45:05.871292 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:05 crc kubenswrapper[4990]: E1003 09:45:05.871437 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.872147 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:45:05 crc kubenswrapper[4990]: E1003 09:45:05.872403 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.922449 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.922496 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.922538 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 09:45:05.922556 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:05 crc kubenswrapper[4990]: I1003 
09:45:05.922569 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:05Z","lastTransitionTime":"2025-10-03T09:45:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.025108 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.025167 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.025182 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.025198 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.025209 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:06Z","lastTransitionTime":"2025-10-03T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.127499 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.127555 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.127563 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.127580 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.127588 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:06Z","lastTransitionTime":"2025-10-03T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.230162 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.230201 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.230209 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.230223 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.230232 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:06Z","lastTransitionTime":"2025-10-03T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.333482 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.333546 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.333556 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.333571 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.333580 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:06Z","lastTransitionTime":"2025-10-03T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.418815 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.418856 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.418865 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.418880 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.418889 4990 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T09:45:06Z","lastTransitionTime":"2025-10-03T09:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.467078 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq"] Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.467533 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.469322 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.469607 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.470571 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.471997 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.512334 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podStartSLOduration=67.512309499 podStartE2EDuration="1m7.512309499s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.512302349 +0000 UTC m=+88.308934206" watchObservedRunningTime="2025-10-03 09:45:06.512309499 +0000 UTC m=+88.308941366" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.565809 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dee4b4-62bb-48da-a22d-57e564574d09-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.566418 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.566554 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75dee4b4-62bb-48da-a22d-57e564574d09-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.566627 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75dee4b4-62bb-48da-a22d-57e564574d09-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.566719 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.576072 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=31.576049239 podStartE2EDuration="31.576049239s" 
podCreationTimestamp="2025-10-03 09:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.562616249 +0000 UTC m=+88.359248106" watchObservedRunningTime="2025-10-03 09:45:06.576049239 +0000 UTC m=+88.372681096" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.576215 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.576211943 podStartE2EDuration="1m6.576211943s" podCreationTimestamp="2025-10-03 09:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.575856734 +0000 UTC m=+88.372488591" watchObservedRunningTime="2025-10-03 09:45:06.576211943 +0000 UTC m=+88.372843800" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.629558 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bspdz" podStartSLOduration=67.629536602 podStartE2EDuration="1m7.629536602s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.619684026 +0000 UTC m=+88.416315913" watchObservedRunningTime="2025-10-03 09:45:06.629536602 +0000 UTC m=+88.426168489" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.647495 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.647473489 podStartE2EDuration="8.647473489s" podCreationTimestamp="2025-10-03 09:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.630147828 +0000 UTC m=+88.426779695" watchObservedRunningTime="2025-10-03 
09:45:06.647473489 +0000 UTC m=+88.444105346" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.660861 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-l8sx4" podStartSLOduration=67.660838407 podStartE2EDuration="1m7.660838407s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.660772866 +0000 UTC m=+88.457404753" watchObservedRunningTime="2025-10-03 09:45:06.660838407 +0000 UTC m=+88.457470264" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.661209 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gb69z" podStartSLOduration=67.661202787 podStartE2EDuration="1m7.661202787s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.649259096 +0000 UTC m=+88.445890963" watchObservedRunningTime="2025-10-03 09:45:06.661202787 +0000 UTC m=+88.457834654" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.667475 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75dee4b4-62bb-48da-a22d-57e564574d09-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.667783 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.667859 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.667887 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75dee4b4-62bb-48da-a22d-57e564574d09-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.668061 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dee4b4-62bb-48da-a22d-57e564574d09-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.668139 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.668245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/75dee4b4-62bb-48da-a22d-57e564574d09-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.668439 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75dee4b4-62bb-48da-a22d-57e564574d09-service-ca\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.677181 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dee4b4-62bb-48da-a22d-57e564574d09-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.684055 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75dee4b4-62bb-48da-a22d-57e564574d09-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-26dgq\" (UID: \"75dee4b4-62bb-48da-a22d-57e564574d09\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.704607 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.704581536 podStartE2EDuration="1m7.704581536s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
09:45:06.691856995 +0000 UTC m=+88.488488862" watchObservedRunningTime="2025-10-03 09:45:06.704581536 +0000 UTC m=+88.501213393" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.726478 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lrqf5" podStartSLOduration=68.726458406 podStartE2EDuration="1m8.726458406s" podCreationTimestamp="2025-10-03 09:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.725976154 +0000 UTC m=+88.522608011" watchObservedRunningTime="2025-10-03 09:45:06.726458406 +0000 UTC m=+88.523090263" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.785303 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" Oct 03 09:45:06 crc kubenswrapper[4990]: W1003 09:45:06.798281 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75dee4b4_62bb_48da_a22d_57e564574d09.slice/crio-3994728a9931c81a5ba217418b2ed669fe3cac5e7a26d7df256d179bc4b0d1f3 WatchSource:0}: Error finding container 3994728a9931c81a5ba217418b2ed669fe3cac5e7a26d7df256d179bc4b0d1f3: Status 404 returned error can't find the container with id 3994728a9931c81a5ba217418b2ed669fe3cac5e7a26d7df256d179bc4b0d1f3 Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.870971 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:06 crc kubenswrapper[4990]: I1003 09:45:06.870989 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:06 crc kubenswrapper[4990]: E1003 09:45:06.871092 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:06 crc kubenswrapper[4990]: E1003 09:45:06.871195 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.422383 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" event={"ID":"75dee4b4-62bb-48da-a22d-57e564574d09","Type":"ContainerStarted","Data":"4030943a43390d6bd79520439a4551423d70ed1812205716e9b0345fbbc4e944"} Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.422442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" event={"ID":"75dee4b4-62bb-48da-a22d-57e564574d09","Type":"ContainerStarted","Data":"3994728a9931c81a5ba217418b2ed669fe3cac5e7a26d7df256d179bc4b0d1f3"} Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.437548 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-26dgq" podStartSLOduration=68.437521973 podStartE2EDuration="1m8.437521973s" 
podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:07.436791654 +0000 UTC m=+89.233423531" watchObservedRunningTime="2025-10-03 09:45:07.437521973 +0000 UTC m=+89.234153830" Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.438274 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j96ms" podStartSLOduration=68.438264942 podStartE2EDuration="1m8.438264942s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:06.737959156 +0000 UTC m=+88.534591013" watchObservedRunningTime="2025-10-03 09:45:07.438264942 +0000 UTC m=+89.234896799" Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.871248 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:07 crc kubenswrapper[4990]: I1003 09:45:07.871249 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:07 crc kubenswrapper[4990]: E1003 09:45:07.871419 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:07 crc kubenswrapper[4990]: E1003 09:45:07.871558 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:08 crc kubenswrapper[4990]: I1003 09:45:08.871352 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:08 crc kubenswrapper[4990]: I1003 09:45:08.875381 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:08 crc kubenswrapper[4990]: E1003 09:45:08.875285 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:08 crc kubenswrapper[4990]: E1003 09:45:08.876050 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:09 crc kubenswrapper[4990]: I1003 09:45:09.870818 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:09 crc kubenswrapper[4990]: I1003 09:45:09.870930 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:09 crc kubenswrapper[4990]: E1003 09:45:09.871031 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:09 crc kubenswrapper[4990]: E1003 09:45:09.871181 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:10 crc kubenswrapper[4990]: I1003 09:45:10.871173 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:10 crc kubenswrapper[4990]: I1003 09:45:10.871252 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:10 crc kubenswrapper[4990]: E1003 09:45:10.871336 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:10 crc kubenswrapper[4990]: E1003 09:45:10.871588 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:11 crc kubenswrapper[4990]: I1003 09:45:11.871221 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:11 crc kubenswrapper[4990]: I1003 09:45:11.871234 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:11 crc kubenswrapper[4990]: E1003 09:45:11.871392 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:11 crc kubenswrapper[4990]: E1003 09:45:11.871604 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:12 crc kubenswrapper[4990]: I1003 09:45:12.871675 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:12 crc kubenswrapper[4990]: I1003 09:45:12.871780 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:12 crc kubenswrapper[4990]: E1003 09:45:12.871846 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:12 crc kubenswrapper[4990]: E1003 09:45:12.871966 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:13 crc kubenswrapper[4990]: I1003 09:45:13.871465 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:13 crc kubenswrapper[4990]: I1003 09:45:13.871653 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:13 crc kubenswrapper[4990]: E1003 09:45:13.872047 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:13 crc kubenswrapper[4990]: E1003 09:45:13.872208 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:14 crc kubenswrapper[4990]: I1003 09:45:14.871707 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:14 crc kubenswrapper[4990]: I1003 09:45:14.871838 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:14 crc kubenswrapper[4990]: E1003 09:45:14.871865 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:14 crc kubenswrapper[4990]: E1003 09:45:14.872060 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:15 crc kubenswrapper[4990]: I1003 09:45:15.870958 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:15 crc kubenswrapper[4990]: I1003 09:45:15.871075 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:15 crc kubenswrapper[4990]: E1003 09:45:15.871207 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:15 crc kubenswrapper[4990]: E1003 09:45:15.871396 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:16 crc kubenswrapper[4990]: I1003 09:45:16.871694 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:16 crc kubenswrapper[4990]: E1003 09:45:16.871863 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:16 crc kubenswrapper[4990]: I1003 09:45:16.872681 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:16 crc kubenswrapper[4990]: I1003 09:45:16.872683 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:45:16 crc kubenswrapper[4990]: E1003 09:45:16.872974 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7rqmg_openshift-ovn-kubernetes(7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" Oct 03 09:45:16 crc kubenswrapper[4990]: E1003 09:45:16.872782 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:17 crc kubenswrapper[4990]: I1003 09:45:17.184218 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:17 crc kubenswrapper[4990]: E1003 09:45:17.184569 4990 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:45:17 crc kubenswrapper[4990]: E1003 09:45:17.184646 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs podName:b2a21582-ac04-4caa-a823-7c30c7f788c9 nodeName:}" failed. No retries permitted until 2025-10-03 09:46:21.184623465 +0000 UTC m=+162.981255322 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs") pod "network-metrics-daemon-gdrcw" (UID: "b2a21582-ac04-4caa-a823-7c30c7f788c9") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 09:45:17 crc kubenswrapper[4990]: I1003 09:45:17.871457 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:17 crc kubenswrapper[4990]: E1003 09:45:17.871633 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:17 crc kubenswrapper[4990]: I1003 09:45:17.871786 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:17 crc kubenswrapper[4990]: E1003 09:45:17.871950 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:18 crc kubenswrapper[4990]: I1003 09:45:18.871798 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:18 crc kubenswrapper[4990]: I1003 09:45:18.871900 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:18 crc kubenswrapper[4990]: E1003 09:45:18.872881 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:18 crc kubenswrapper[4990]: E1003 09:45:18.872977 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:19 crc kubenswrapper[4990]: I1003 09:45:19.871302 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:19 crc kubenswrapper[4990]: I1003 09:45:19.871311 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:19 crc kubenswrapper[4990]: E1003 09:45:19.871472 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:19 crc kubenswrapper[4990]: E1003 09:45:19.871589 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:20 crc kubenswrapper[4990]: I1003 09:45:20.870912 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:20 crc kubenswrapper[4990]: I1003 09:45:20.871071 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:20 crc kubenswrapper[4990]: E1003 09:45:20.871181 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:20 crc kubenswrapper[4990]: E1003 09:45:20.871422 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:21 crc kubenswrapper[4990]: I1003 09:45:21.870853 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:21 crc kubenswrapper[4990]: I1003 09:45:21.871006 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:21 crc kubenswrapper[4990]: E1003 09:45:21.871010 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:21 crc kubenswrapper[4990]: E1003 09:45:21.871228 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:21 crc kubenswrapper[4990]: I1003 09:45:21.886667 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 09:45:22 crc kubenswrapper[4990]: I1003 09:45:22.870992 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:22 crc kubenswrapper[4990]: I1003 09:45:22.871082 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:22 crc kubenswrapper[4990]: E1003 09:45:22.871148 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:22 crc kubenswrapper[4990]: E1003 09:45:22.871237 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:23 crc kubenswrapper[4990]: I1003 09:45:23.870808 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:23 crc kubenswrapper[4990]: I1003 09:45:23.870856 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:23 crc kubenswrapper[4990]: E1003 09:45:23.870929 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:23 crc kubenswrapper[4990]: E1003 09:45:23.870993 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:24 crc kubenswrapper[4990]: I1003 09:45:24.871570 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:24 crc kubenswrapper[4990]: E1003 09:45:24.871798 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:24 crc kubenswrapper[4990]: I1003 09:45:24.871595 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:24 crc kubenswrapper[4990]: E1003 09:45:24.871912 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:25 crc kubenswrapper[4990]: I1003 09:45:25.871012 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:25 crc kubenswrapper[4990]: I1003 09:45:25.871096 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:25 crc kubenswrapper[4990]: E1003 09:45:25.871168 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:25 crc kubenswrapper[4990]: E1003 09:45:25.871260 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:26 crc kubenswrapper[4990]: I1003 09:45:26.871137 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:26 crc kubenswrapper[4990]: I1003 09:45:26.871202 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:26 crc kubenswrapper[4990]: E1003 09:45:26.871428 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:26 crc kubenswrapper[4990]: E1003 09:45:26.871579 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:27 crc kubenswrapper[4990]: I1003 09:45:27.871100 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:27 crc kubenswrapper[4990]: I1003 09:45:27.871100 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:27 crc kubenswrapper[4990]: E1003 09:45:27.871251 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:27 crc kubenswrapper[4990]: E1003 09:45:27.871367 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:28 crc kubenswrapper[4990]: I1003 09:45:28.871261 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:28 crc kubenswrapper[4990]: I1003 09:45:28.871324 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:28 crc kubenswrapper[4990]: E1003 09:45:28.872563 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:28 crc kubenswrapper[4990]: E1003 09:45:28.872699 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:28 crc kubenswrapper[4990]: I1003 09:45:28.911087 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=7.911036242 podStartE2EDuration="7.911036242s" podCreationTimestamp="2025-10-03 09:45:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:28.91097206 +0000 UTC m=+110.707603937" watchObservedRunningTime="2025-10-03 09:45:28.911036242 +0000 UTC m=+110.707668099" Oct 03 09:45:29 crc kubenswrapper[4990]: I1003 09:45:29.870961 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:29 crc kubenswrapper[4990]: I1003 09:45:29.871139 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:29 crc kubenswrapper[4990]: E1003 09:45:29.871319 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:29 crc kubenswrapper[4990]: E1003 09:45:29.871471 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:30 crc kubenswrapper[4990]: I1003 09:45:30.871569 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:30 crc kubenswrapper[4990]: I1003 09:45:30.871584 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:30 crc kubenswrapper[4990]: I1003 09:45:30.872296 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:45:30 crc kubenswrapper[4990]: E1003 09:45:30.872422 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:30 crc kubenswrapper[4990]: E1003 09:45:30.872543 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.501473 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/3.log" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.504538 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerStarted","Data":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.505122 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.770434 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podStartSLOduration=92.770404862 podStartE2EDuration="1m32.770404862s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:31.533908583 +0000 UTC m=+113.330540470" watchObservedRunningTime="2025-10-03 09:45:31.770404862 +0000 UTC m=+113.567036759" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.772071 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-gdrcw"] Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.772200 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:31 crc kubenswrapper[4990]: E1003 09:45:31.772342 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.871592 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:31 crc kubenswrapper[4990]: I1003 09:45:31.871644 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:31 crc kubenswrapper[4990]: E1003 09:45:31.871752 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:31 crc kubenswrapper[4990]: E1003 09:45:31.871987 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:32 crc kubenswrapper[4990]: I1003 09:45:32.870994 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:32 crc kubenswrapper[4990]: E1003 09:45:32.871134 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 09:45:33 crc kubenswrapper[4990]: I1003 09:45:33.871458 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:33 crc kubenswrapper[4990]: I1003 09:45:33.871553 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:33 crc kubenswrapper[4990]: I1003 09:45:33.871480 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:33 crc kubenswrapper[4990]: E1003 09:45:33.871630 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gdrcw" podUID="b2a21582-ac04-4caa-a823-7c30c7f788c9" Oct 03 09:45:33 crc kubenswrapper[4990]: E1003 09:45:33.871730 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 09:45:33 crc kubenswrapper[4990]: E1003 09:45:33.871844 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.177079 4990 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.177256 4990 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.208145 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9k9dv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.208696 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.211529 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.211852 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212029 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212235 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212358 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212495 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212756 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.212985 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.213030 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.213293 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zjl7g"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.213701 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.214193 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.214752 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.215244 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kftbs"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.215720 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.230917 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.232180 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.237706 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.238885 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.246394 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.249876 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.250306 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.250709 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.250865 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.251126 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.251286 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.251486 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.251784 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.252358 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.252759 4990 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.253001 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.254195 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.254811 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.254973 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255048 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255088 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255108 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255173 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255206 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255211 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 
09:45:34.255244 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255059 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255320 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255503 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255718 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255852 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.255739 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256178 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256250 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256299 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256468 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 09:45:34 crc 
kubenswrapper[4990]: I1003 09:45:34.256615 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256659 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256704 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256789 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256833 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256790 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256950 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256981 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.256987 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.257574 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.257842 4990 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.258144 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.261064 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s22jw"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.261581 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.261788 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.261875 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.264026 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.264460 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.265408 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.266478 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.266985 4990 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f5nfx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.267204 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.267551 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.267973 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.268225 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.268880 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.269430 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.269542 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.269733 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.269910 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.269927 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.270215 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.270349 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.272341 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.273428 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9k9dv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.274139 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.274808 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.275535 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.275730 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.276711 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277129 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277791 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680507e-96dc-43bd-ade2-61985d923ddc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277827 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e4fbb0a8-9cb0-49b1-9e81-77c291382938-machine-approver-tls\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277856 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277879 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit-dir\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277904 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277930 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-encryption-config\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277952 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9680507e-96dc-43bd-ade2-61985d923ddc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277975 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-config\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.277999 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278019 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-serving-cert\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d64950-0314-46ae-9177-349a3f70404d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278060 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-config\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278082 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278103 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-node-pullsecrets\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.278124 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.285691 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.287397 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.287836 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94t6m\" (UniqueName: \"kubernetes.io/projected/9680507e-96dc-43bd-ade2-61985d923ddc-kube-api-access-94t6m\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.287977 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288157 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288298 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288476 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61d64950-0314-46ae-9177-349a3f70404d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288602 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288810 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-policies\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.288952 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.289085 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca\") pod \"controller-manager-879f6c89f-5gscv\" (UID: 
\"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.289451 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.289741 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-image-import-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293176 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc74c56c-b4ba-479b-87ba-ba707c62af66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293265 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-serving-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293316 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293339 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-auth-proxy-config\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293483 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-client\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293603 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txxmg\" (UniqueName: \"kubernetes.io/projected/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-kube-api-access-txxmg\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293747 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" 
Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293789 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnsb5\" (UniqueName: \"kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz688\" (UniqueName: \"kubernetes.io/projected/60f1ddd8-ebdb-4575-b06e-619cbe196937-kube-api-access-wz688\") pod \"downloads-7954f5f757-f5nfx\" (UID: \"60f1ddd8-ebdb-4575-b06e-619cbe196937\") " pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293957 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.293997 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-encryption-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294090 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294151 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-serving-cert\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294401 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294381 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdbz\" (UniqueName: \"kubernetes.io/projected/650ac140-20eb-4503-bcf2-f9795cebaff2-kube-api-access-mpdbz\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294902 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.294997 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8637f893-27db-425a-85aa-8d8d5c6e1dba-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295080 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-config\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295135 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289mf\" (UniqueName: \"kubernetes.io/projected/e4fbb0a8-9cb0-49b1-9e81-77c291382938-kube-api-access-289mf\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295172 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-serving-cert\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295221 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-serving-cert\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 
09:45:34.295321 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295362 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295438 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ztc\" (UniqueName: \"kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295638 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr77n\" (UniqueName: \"kubernetes.io/projected/4a6095bd-ea6a-4e7b-8504-89aa4704a720-kube-api-access-cr77n\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295755 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config\") pod 
\"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295775 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295833 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295942 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-client\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.295999 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxrr\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-kube-api-access-nnxrr\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc 
kubenswrapper[4990]: I1003 09:45:34.296069 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.296095 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-images\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.296130 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29xw\" (UniqueName: \"kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.296225 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-dir\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.296270 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjh7\" (UniqueName: 
\"kubernetes.io/projected/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-kube-api-access-mgjh7\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.297100 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k48m\" (UniqueName: \"kubernetes.io/projected/bc74c56c-b4ba-479b-87ba-ba707c62af66-kube-api-access-2k48m\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.297185 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktflm\" (UniqueName: \"kubernetes.io/projected/61d64950-0314-46ae-9177-349a3f70404d-kube-api-access-ktflm\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.297212 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fhv\" (UniqueName: \"kubernetes.io/projected/8637f893-27db-425a-85aa-8d8d5c6e1dba-kube-api-access-85fhv\") pod \"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.297212 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.300268 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.300383 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.300586 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-llj5c"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306009 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306293 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306428 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306552 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306663 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.306897 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.307006 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.307295 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 09:45:34 crc 
kubenswrapper[4990]: I1003 09:45:34.307456 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.307539 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.310019 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.312804 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.313229 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.313465 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.313725 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.313953 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314207 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314501 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314654 4990 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314772 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314599 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.315054 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314920 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.314966 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.315824 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.315961 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.316078 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.315921 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.316403 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 
09:45:34.317016 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.317472 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.317699 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.317867 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.318243 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.318350 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.319181 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.320522 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zr9m5"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.321079 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.326658 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvmfh"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.327601 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.327823 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.327977 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.328114 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.328448 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.328743 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hdl6b"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.329285 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.329681 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.330120 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.330496 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.331294 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.333772 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.334011 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.334962 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.336489 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.340967 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.343658 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.348883 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.353293 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.353489 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qn5kr"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.372358 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.373624 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.376293 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.376440 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.376500 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.381791 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.382847 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.383901 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.384106 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.385223 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.389878 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.391533 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.395538 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400356 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400387 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61d64950-0314-46ae-9177-349a3f70404d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400409 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400430 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-policies\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400455 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400478 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400498 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400543 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85165410-2d0a-4af1-a60f-a1374e58c3f1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400569 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-image-import-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400594 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc74c56c-b4ba-479b-87ba-ba707c62af66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400614 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-serving-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400684 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-auth-proxy-config\") pod 
\"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400707 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-client\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400730 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nwnz\" (UniqueName: \"kubernetes.io/projected/554eeba0-898e-4fd2-9308-2e3028b4e8ae-kube-api-access-2nwnz\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400755 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4ebd5bf8-15f2-4538-b8ad-08504265e855-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400778 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl4kf\" (UniqueName: \"kubernetes.io/projected/4ebd5bf8-15f2-4538-b8ad-08504265e855-kube-api-access-zl4kf\") pod \"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400816 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400880 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxmg\" (UniqueName: \"kubernetes.io/projected/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-kube-api-access-txxmg\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400931 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400966 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnsb5\" (UniqueName: \"kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.400998 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz688\" (UniqueName: \"kubernetes.io/projected/60f1ddd8-ebdb-4575-b06e-619cbe196937-kube-api-access-wz688\") pod \"downloads-7954f5f757-f5nfx\" (UID: \"60f1ddd8-ebdb-4575-b06e-619cbe196937\") " pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:34 crc kubenswrapper[4990]: 
I1003 09:45:34.401049 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-encryption-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401117 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401168 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-serving-cert\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401193 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85165410-2d0a-4af1-a60f-a1374e58c3f1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401218 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdbz\" (UniqueName: \"kubernetes.io/projected/650ac140-20eb-4503-bcf2-f9795cebaff2-kube-api-access-mpdbz\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401252 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401276 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8637f893-27db-425a-85aa-8d8d5c6e1dba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401304 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-config\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401330 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-289mf\" (UniqueName: 
\"kubernetes.io/projected/e4fbb0a8-9cb0-49b1-9e81-77c291382938-kube-api-access-289mf\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401383 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-serving-cert\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401414 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-serving-cert\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401441 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-serving-cert\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401480 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 
09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401528 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401552 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ztc\" (UniqueName: \"kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401580 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr77n\" (UniqueName: \"kubernetes.io/projected/4a6095bd-ea6a-4e7b-8504-89aa4704a720-kube-api-access-cr77n\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401660 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401686 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-client\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401711 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxrr\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-kube-api-access-nnxrr\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401743 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401931 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-policies\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401748 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402037 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-images\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402073 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g29xw\" (UniqueName: \"kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402109 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-config\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402200 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-dir\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402230 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjh7\" (UniqueName: \"kubernetes.io/projected/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-kube-api-access-mgjh7\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402255 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-client\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402288 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k48m\" (UniqueName: \"kubernetes.io/projected/bc74c56c-b4ba-479b-87ba-ba707c62af66-kube-api-access-2k48m\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402308 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402314 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktflm\" (UniqueName: \"kubernetes.io/projected/61d64950-0314-46ae-9177-349a3f70404d-kube-api-access-ktflm\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402384 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fhv\" (UniqueName: \"kubernetes.io/projected/8637f893-27db-425a-85aa-8d8d5c6e1dba-kube-api-access-85fhv\") pod \"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402416 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680507e-96dc-43bd-ade2-61985d923ddc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402446 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e4fbb0a8-9cb0-49b1-9e81-77c291382938-machine-approver-tls\") pod 
\"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402480 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402549 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit-dir\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402576 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402598 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/650ac140-20eb-4503-bcf2-f9795cebaff2-audit-dir\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402599 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-serving-cert\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402646 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-encryption-config\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402666 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9680507e-96dc-43bd-ade2-61985d923ddc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402686 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-config\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.402722 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d64950-0314-46ae-9177-349a3f70404d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.403950 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404005 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-images\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404574 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404766 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-config\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404817 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404843 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-node-pullsecrets\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404860 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404883 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/554eeba0-898e-4fd2-9308-2e3028b4e8ae-metrics-tls\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404906 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-service-ca-bundle\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404928 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85165410-2d0a-4af1-a60f-a1374e58c3f1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404964 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p89dk\" (UniqueName: \"kubernetes.io/projected/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-kube-api-access-p89dk\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.404985 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-service-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.405033 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94t6m\" (UniqueName: \"kubernetes.io/projected/9680507e-96dc-43bd-ade2-61985d923ddc-kube-api-access-94t6m\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.405062 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.405091 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.405065 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.405996 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-config\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.406344 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.407174 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.407705 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d64950-0314-46ae-9177-349a3f70404d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.407860 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-image-import-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.408340 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-config\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.408577 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.408579 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.409143 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61d64950-0314-46ae-9177-349a3f70404d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:34 crc 
kubenswrapper[4990]: I1003 09:45:34.409304 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.409427 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-node-pullsecrets\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.410595 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.411215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4fbb0a8-9cb0-49b1-9e81-77c291382938-auth-proxy-config\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.411687 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc74c56c-b4ba-479b-87ba-ba707c62af66-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.411809 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.411947 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.412298 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.412553 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.412998 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.413203 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-service-ca-bundle\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.413225 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.413798 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-serving-ca\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.413922 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.414367 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.401408 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.414657 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.417094 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc74c56c-b4ba-479b-87ba-ba707c62af66-config\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.417315 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-encryption-config\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.418440 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.418782 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.421361 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-serving-cert\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.421636 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a6095bd-ea6a-4e7b-8504-89aa4704a720-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.421759 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kftbs"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.421801 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.422589 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-serving-cert\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.422640 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4a6095bd-ea6a-4e7b-8504-89aa4704a720-audit-dir\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.422959 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-serving-cert\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.423319 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.425236 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.426487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.427045 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s22jw"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.432146 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-f5nfx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.429102 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e4fbb0a8-9cb0-49b1-9e81-77c291382938-machine-approver-tls\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.430400 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8637f893-27db-425a-85aa-8d8d5c6e1dba-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.431345 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.432066 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.428903 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9680507e-96dc-43bd-ade2-61985d923ddc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.433693 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.433697 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.434855 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.438775 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-etcd-client\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.440164 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.440398 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4a6095bd-ea6a-4e7b-8504-89aa4704a720-encryption-config\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.440472 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-serving-cert\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.442465 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.442783 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/650ac140-20eb-4503-bcf2-f9795cebaff2-etcd-client\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.443342 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwrlt"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.443998 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.444835 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.446243 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9680507e-96dc-43bd-ade2-61985d923ddc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.447931 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zjl7g"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.450173 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-llj5c"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.451545 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.452691 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.453560 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:45:34 crc 
kubenswrapper[4990]: I1003 09:45:34.454599 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-flvhx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.455436 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.455470 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gsvpm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.456452 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.456550 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.457591 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvmfh"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.458578 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zr9m5"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.459655 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.461208 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.462133 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwrlt"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.463583 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.464323 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sgmrx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.464916 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.465290 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.466855 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.467702 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.468909 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.470033 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.470806 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.471191 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.472174 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-qn5kr"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.473830 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.475029 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.478209 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.479284 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.481936 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.485179 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sgmrx"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.487557 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.490783 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.492886 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gsvpm"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.493170 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.494317 
4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.495645 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.498373 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69dts"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.499991 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69dts"] Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.500135 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.505875 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85165410-2d0a-4af1-a60f-a1374e58c3f1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.505910 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nwnz\" (UniqueName: \"kubernetes.io/projected/554eeba0-898e-4fd2-9308-2e3028b4e8ae-kube-api-access-2nwnz\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.505938 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4ebd5bf8-15f2-4538-b8ad-08504265e855-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.505954 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl4kf\" (UniqueName: \"kubernetes.io/projected/4ebd5bf8-15f2-4538-b8ad-08504265e855-kube-api-access-zl4kf\") pod \"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.505974 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506014 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85165410-2d0a-4af1-a60f-a1374e58c3f1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506047 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-serving-cert\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506100 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-config\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506130 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-client\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506174 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/554eeba0-898e-4fd2-9308-2e3028b4e8ae-metrics-tls\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506193 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85165410-2d0a-4af1-a60f-a1374e58c3f1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506218 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-service-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.506235 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p89dk\" (UniqueName: \"kubernetes.io/projected/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-kube-api-access-p89dk\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.509754 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/554eeba0-898e-4fd2-9308-2e3028b4e8ae-metrics-tls\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.510856 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.531001 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.551727 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.571419 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.592772 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.612213 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 
09:45:34.631082 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.658742 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.680874 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.691229 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.696855 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.710857 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.731401 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.740708 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-serving-cert\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.752544 4990 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.771248 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.791404 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.797193 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-config\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.812134 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.819767 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-client\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.831901 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.837606 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-etcd-service-ca\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 
09:45:34.851821 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.871421 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.871757 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.891692 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.911887 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.932277 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.952065 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.971635 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 09:45:34 crc kubenswrapper[4990]: I1003 09:45:34.991866 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.011689 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.032266 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 
09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.041210 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85165410-2d0a-4af1-a60f-a1374e58c3f1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.051251 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.057385 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85165410-2d0a-4af1-a60f-a1374e58c3f1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.071699 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.091714 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.112042 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.141585 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.152169 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 
09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.171276 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.192714 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.211249 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.232127 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.251654 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.291818 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.311218 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.321138 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4ebd5bf8-15f2-4538-b8ad-08504265e855-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.331463 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 09:45:35 crc 
kubenswrapper[4990]: I1003 09:45:35.351980 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.371950 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.429266 4990 request.go:700] Waited for 1.045428424s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.431631 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.431917 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.431986 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.451264 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.491027 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.491181 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.508802 4990 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.511379 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.531155 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.550730 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.571390 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.592049 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.611487 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.632541 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.690965 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.690980 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-bound-sa-token\") 
pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.736910 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktflm\" (UniqueName: \"kubernetes.io/projected/61d64950-0314-46ae-9177-349a3f70404d-kube-api-access-ktflm\") pod \"openshift-apiserver-operator-796bbdcf4f-7xjqr\" (UID: \"61d64950-0314-46ae-9177-349a3f70404d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.752934 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjh7\" (UniqueName: \"kubernetes.io/projected/90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b-kube-api-access-mgjh7\") pod \"authentication-operator-69f744f599-kftbs\" (UID: \"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.769442 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k48m\" (UniqueName: \"kubernetes.io/projected/bc74c56c-b4ba-479b-87ba-ba707c62af66-kube-api-access-2k48m\") pod \"machine-api-operator-5694c8668f-9k9dv\" (UID: \"bc74c56c-b4ba-479b-87ba-ba707c62af66\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.771833 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.791719 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.812284 4990 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.832394 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.853158 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.856866 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.871005 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.871021 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.871024 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.871128 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.881990 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.910919 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdbz\" (UniqueName: \"kubernetes.io/projected/650ac140-20eb-4503-bcf2-f9795cebaff2-kube-api-access-mpdbz\") pod \"apiserver-7bbb656c7d-tjqxd\" (UID: \"650ac140-20eb-4503-bcf2-f9795cebaff2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.926168 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29xw\" (UniqueName: \"kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw\") pod \"route-controller-manager-6576b87f9c-jmd4s\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.954158 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxmg\" (UniqueName: \"kubernetes.io/projected/f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6-kube-api-access-txxmg\") pod \"openshift-config-operator-7777fb866f-s22jw\" (UID: \"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.965156 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.968018 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnsb5\" (UniqueName: \"kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5\") pod \"console-f9d7485db-5jgtm\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.974406 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:35 crc kubenswrapper[4990]: I1003 09:45:35.992266 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz688\" (UniqueName: \"kubernetes.io/projected/60f1ddd8-ebdb-4575-b06e-619cbe196937-kube-api-access-wz688\") pod \"downloads-7954f5f757-f5nfx\" (UID: \"60f1ddd8-ebdb-4575-b06e-619cbe196937\") " pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.012826 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94t6m\" (UniqueName: \"kubernetes.io/projected/9680507e-96dc-43bd-ade2-61985d923ddc-kube-api-access-94t6m\") pod \"openshift-controller-manager-operator-756b6f6bc6-5n7fm\" (UID: \"9680507e-96dc-43bd-ade2-61985d923ddc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.027292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fhv\" (UniqueName: \"kubernetes.io/projected/8637f893-27db-425a-85aa-8d8d5c6e1dba-kube-api-access-85fhv\") pod \"cluster-samples-operator-665b6dd947-2br82\" (UID: \"8637f893-27db-425a-85aa-8d8d5c6e1dba\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" 
Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.047557 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.049061 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-289mf\" (UniqueName: \"kubernetes.io/projected/e4fbb0a8-9cb0-49b1-9e81-77c291382938-kube-api-access-289mf\") pod \"machine-approver-56656f9798-hqwkx\" (UID: \"e4fbb0a8-9cb0-49b1-9e81-77c291382938\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.064847 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kftbs"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.080521 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxrr\" (UniqueName: \"kubernetes.io/projected/53edcb9e-1d3f-48bf-a457-e1f3ec65bba5-kube-api-access-nnxrr\") pod \"cluster-image-registry-operator-dc59b4c8b-85xqf\" (UID: \"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:36 crc kubenswrapper[4990]: W1003 09:45:36.086742 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90f5fed6_bcc3_4af8_ac9c_4ff12fe9bc1b.slice/crio-066e8d5dd8e96c7b4c0976cf800b84b6a1a017d786dfec61d0a655746d90c935 WatchSource:0}: Error finding container 066e8d5dd8e96c7b4c0976cf800b84b6a1a017d786dfec61d0a655746d90c935: Status 404 returned error can't find the container with id 066e8d5dd8e96c7b4c0976cf800b84b6a1a017d786dfec61d0a655746d90c935 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.090535 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.091069 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr77n\" (UniqueName: \"kubernetes.io/projected/4a6095bd-ea6a-4e7b-8504-89aa4704a720-kube-api-access-cr77n\") pod \"apiserver-76f77b778f-zjl7g\" (UID: \"4a6095bd-ea6a-4e7b-8504-89aa4704a720\") " pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.093465 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.107806 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ztc\" (UniqueName: \"kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc\") pod \"controller-manager-879f6c89f-5gscv\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.112526 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: W1003 09:45:36.113676 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d64950_0314_46ae_9177_349a3f70404d.slice/crio-e3481d7b619cfe10e64d27999048def11aaeb186ba27813ef7dadd3d78e90058 WatchSource:0}: Error finding container e3481d7b619cfe10e64d27999048def11aaeb186ba27813ef7dadd3d78e90058: Status 404 returned error can't find the container with id e3481d7b619cfe10e64d27999048def11aaeb186ba27813ef7dadd3d78e90058 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.113926 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.123381 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.132406 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.152731 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.173718 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.192364 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.200809 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.212426 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.216373 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.233583 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.255248 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.257054 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.272735 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.282300 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.290987 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.292720 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.296046 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.305177 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9k9dv"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.312229 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.332304 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.347066 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.352076 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.360145 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.371936 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.381220 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.392782 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.399713 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-s22jw"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.412256 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zjl7g"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.416034 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.429823 4990 request.go:700] Waited for 1.973120742s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.431930 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.451988 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.472414 4990 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.475805 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.493785 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.512068 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.532358 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.540266 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" event={"ID":"bc74c56c-b4ba-479b-87ba-ba707c62af66","Type":"ContainerStarted","Data":"823f8261355100bb7dd7479181baaef91b22acdf0689d92c7c181821d90cd136"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.540937 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.543651 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" event={"ID":"61d64950-0314-46ae-9177-349a3f70404d","Type":"ContainerStarted","Data":"f5bead770574ea9b301882b2c45f4240f19f8e90a8445e5b7e7cb6743fd5ef36"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.543693 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" 
event={"ID":"61d64950-0314-46ae-9177-349a3f70404d","Type":"ContainerStarted","Data":"e3481d7b619cfe10e64d27999048def11aaeb186ba27813ef7dadd3d78e90058"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.553083 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.571135 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" event={"ID":"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b","Type":"ContainerStarted","Data":"73602a93a716a852539b1234aa988facbe48eef8c52cc7107caa7fe86c7ae5ae"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.571178 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" event={"ID":"90f5fed6-bcc3-4af8-ac9c-4ff12fe9bc1b","Type":"ContainerStarted","Data":"066e8d5dd8e96c7b4c0976cf800b84b6a1a017d786dfec61d0a655746d90c935"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.572085 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.573169 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" event={"ID":"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6","Type":"ContainerStarted","Data":"ebd2fc58e91e08317c3b9dea861b5bbfe1ff9085282abf09917b7083fcab7685"} Oct 03 09:45:36 crc kubenswrapper[4990]: W1003 09:45:36.573619 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9680507e_96dc_43bd_ade2_61985d923ddc.slice/crio-0d2008e0b99ed5b72ef874bf97f97bbedf2b852018833305510da9e776649ff7 WatchSource:0}: Error finding container 0d2008e0b99ed5b72ef874bf97f97bbedf2b852018833305510da9e776649ff7: Status 404 returned error 
can't find the container with id 0d2008e0b99ed5b72ef874bf97f97bbedf2b852018833305510da9e776649ff7 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.574082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" event={"ID":"0450636d-e918-475a-af84-690cac4baa47","Type":"ContainerStarted","Data":"93e1943934585663712dd796ae611441ed5a21e7edad0c86f9ea0458163f217c"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.575491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" event={"ID":"650ac140-20eb-4503-bcf2-f9795cebaff2","Type":"ContainerStarted","Data":"fda191ff0b0eccbcaa6e3f502da99e45f49a454f782cf255286db6c6ca409ec1"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.591868 4990 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.604148 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jgtm" event={"ID":"ee5a405d-75fe-4968-881c-62d8c6d0dd5a","Type":"ContainerStarted","Data":"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.604194 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jgtm" event={"ID":"ee5a405d-75fe-4968-881c-62d8c6d0dd5a","Type":"ContainerStarted","Data":"70a0ca07068b717c6349f61faf20f21b9a9819390e97bbabaea02c512c51bf31"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.607990 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" event={"ID":"e4fbb0a8-9cb0-49b1-9e81-77c291382938","Type":"ContainerStarted","Data":"5dccf76965e05902b1191cff49bf1d77967367241551e35e7f6025166ea95f04"} Oct 03 09:45:36 crc 
kubenswrapper[4990]: I1003 09:45:36.609467 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" event={"ID":"4a6095bd-ea6a-4e7b-8504-89aa4704a720","Type":"ContainerStarted","Data":"f0d9d129211a751f8daf1f4592162bfa309c001fb9ffaa04002622cbfbbbc1fc"} Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.637439 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85165410-2d0a-4af1-a60f-a1374e58c3f1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m8h8\" (UID: \"85165410-2d0a-4af1-a60f-a1374e58c3f1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.645156 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nwnz\" (UniqueName: \"kubernetes.io/projected/554eeba0-898e-4fd2-9308-2e3028b4e8ae-kube-api-access-2nwnz\") pod \"dns-operator-744455d44c-llj5c\" (UID: \"554eeba0-898e-4fd2-9308-2e3028b4e8ae\") " pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.654364 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.661213 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.673110 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl4kf\" (UniqueName: \"kubernetes.io/projected/4ebd5bf8-15f2-4538-b8ad-08504265e855-kube-api-access-zl4kf\") pod \"multus-admission-controller-857f4d67dd-qn5kr\" (UID: \"4ebd5bf8-15f2-4538-b8ad-08504265e855\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:36 crc kubenswrapper[4990]: W1003 09:45:36.677178 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53edcb9e_1d3f_48bf_a457_e1f3ec65bba5.slice/crio-009ebeac5a0598149dd2c10707b13b35acd9096c5f373c0ca008b0457b0b50f4 WatchSource:0}: Error finding container 009ebeac5a0598149dd2c10707b13b35acd9096c5f373c0ca008b0457b0b50f4: Status 404 returned error can't find the container with id 009ebeac5a0598149dd2c10707b13b35acd9096c5f373c0ca008b0457b0b50f4 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.691147 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p89dk\" (UniqueName: \"kubernetes.io/projected/d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2-kube-api-access-p89dk\") pod \"etcd-operator-b45778765-qvmfh\" (UID: \"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.692616 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.701236 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.712359 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f5nfx"] Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.714356 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 09:45:36 crc kubenswrapper[4990]: W1003 09:45:36.723553 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f1ddd8_ebdb_4575_b06e_619cbe196937.slice/crio-7171fef666b60c89a9bbe48236fc2b11c61eccdbe64a45d110809d08e3cf67d8 WatchSource:0}: Error finding container 7171fef666b60c89a9bbe48236fc2b11c61eccdbe64a45d110809d08e3cf67d8: Status 404 returned error can't find the container with id 7171fef666b60c89a9bbe48236fc2b11c61eccdbe64a45d110809d08e3cf67d8 Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758220 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518ef04-5379-442a-89d6-f0ab745e35c6-proxy-tls\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758739 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758765 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e8c510-4ff7-4758-b452-bdcc3eb6023b-config\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758797 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758813 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-metrics-certs\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758834 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758849 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758925 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.758948 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4qn\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759043 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759060 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6e6b23-3c66-42a4-8f01-c8605efe1412-serving-cert\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759075 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89m68\" (UniqueName: \"kubernetes.io/projected/509b877d-1d79-4882-96f7-725d81a000cd-kube-api-access-89m68\") pod \"migrator-59844c95c7-29mz5\" (UID: \"509b877d-1d79-4882-96f7-725d81a000cd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759092 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgmwj\" (UniqueName: \"kubernetes.io/projected/a518ef04-5379-442a-89d6-f0ab745e35c6-kube-api-access-vgmwj\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759128 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759146 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlf8\" (UniqueName: \"kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8\") 
pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759165 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.759916 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-default-certificate\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: E1003 09:45:36.760561 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.26054178 +0000 UTC m=+119.057173737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.760222 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1469c562-1020-4d06-8018-6fd61392d855-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.762398 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-stats-auth\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.762462 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcm2\" (UniqueName: \"kubernetes.io/projected/3a6e6b23-3c66-42a4-8f01-c8605efe1412-kube-api-access-dfcm2\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.762496 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/1469c562-1020-4d06-8018-6fd61392d855-proxy-tls\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.762640 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7dn6\" (UniqueName: \"kubernetes.io/projected/1469c562-1020-4d06-8018-6fd61392d855-kube-api-access-t7dn6\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763105 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763175 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-images\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763315 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763475 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-trusted-ca\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763528 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e8c510-4ff7-4758-b452-bdcc3eb6023b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763630 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e8c510-4ff7-4758-b452-bdcc3eb6023b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763660 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.763718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764098 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764129 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764149 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b93f3c26-69a5-4220-956a-2f6bbf884c9a-service-ca-bundle\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764199 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764216 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vdkw\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-kube-api-access-2vdkw\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764232 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764383 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764680 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9mvx\" (UniqueName: \"kubernetes.io/projected/b93f3c26-69a5-4220-956a-2f6bbf884c9a-kube-api-access-c9mvx\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764713 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e02d43f-9df8-4896-b407-23fc963d956b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764776 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.764812 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-config\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765025 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765101 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765140 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-config\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765179 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765704 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: 
\"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.765772 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e02d43f-9df8-4896-b407-23fc963d956b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.772349 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.792739 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.811603 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.831769 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867568 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867764 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: 
\"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867789 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867842 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: E1003 09:45:36.867838 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.367802783 +0000 UTC m=+119.164434710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867881 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcjtc\" (UniqueName: \"kubernetes.io/projected/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-kube-api-access-vcjtc\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867900 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.867919 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-registration-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868011 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-webhook-cert\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868105 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4qn\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868495 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868544 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-srv-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868658 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6e6b23-3c66-42a4-8f01-c8605efe1412-serving-cert\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868727 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89m68\" (UniqueName: \"kubernetes.io/projected/509b877d-1d79-4882-96f7-725d81a000cd-kube-api-access-89m68\") pod \"migrator-59844c95c7-29mz5\" (UID: \"509b877d-1d79-4882-96f7-725d81a000cd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868827 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-certs\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95h4f\" (UniqueName: \"kubernetes.io/projected/50514b1c-335c-4d26-8ce2-a918e0d262da-kube-api-access-95h4f\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868904 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7251f858-d281-4566-a08d-e181cccf0542-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.868993 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869215 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgmwj\" (UniqueName: \"kubernetes.io/projected/a518ef04-5379-442a-89d6-f0ab745e35c6-kube-api-access-vgmwj\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869271 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869318 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlf8\" (UniqueName: \"kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869346 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869367 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-node-bootstrap-token\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869390 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f47486b-dcdc-457c-b6d1-2956bc3155f0-config-volume\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869413 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-default-certificate\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869430 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7251f858-d281-4566-a08d-e181cccf0542-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869449 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/130e5354-c1dd-4314-a44a-21e77603424e-tmpfs\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869468 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xqf\" (UniqueName: \"kubernetes.io/projected/92e83dbe-2e24-432d-9a97-1a15a829b009-kube-api-access-h4xqf\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869521 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869543 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869660 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1469c562-1020-4d06-8018-6fd61392d855-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.869694 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-stats-auth\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870343 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870428 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcm2\" (UniqueName: \"kubernetes.io/projected/3a6e6b23-3c66-42a4-8f01-c8605efe1412-kube-api-access-dfcm2\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870462 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870477 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/1469c562-1020-4d06-8018-6fd61392d855-proxy-tls\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870533 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-csi-data-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.870684 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50514b1c-335c-4d26-8ce2-a918e0d262da-serving-cert\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871059 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7dn6\" (UniqueName: \"kubernetes.io/projected/1469c562-1020-4d06-8018-6fd61392d855-kube-api-access-t7dn6\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871116 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 
09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871199 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-images\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871248 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc9n5\" (UniqueName: \"kubernetes.io/projected/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-kube-api-access-jc9n5\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871355 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-plugins-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871392 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871453 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-trusted-ca\") pod \"console-operator-58897d9998-zr9m5\" (UID: 
\"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871476 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e8c510-4ff7-4758-b452-bdcc3eb6023b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871543 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e8c510-4ff7-4758-b452-bdcc3eb6023b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.871950 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.872310 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/a518ef04-5379-442a-89d6-f0ab745e35c6-images\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.875455 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.875551 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-trusted-ca\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.875701 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.875881 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.875962 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-cert\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876319 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1469c562-1020-4d06-8018-6fd61392d855-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876456 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876687 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876718 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b93f3c26-69a5-4220-956a-2f6bbf884c9a-service-ca-bundle\") pod \"router-default-5444994796-hdl6b\" (UID: 
\"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876760 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876786 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vdkw\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-kube-api-access-2vdkw\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876823 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wlh\" (UniqueName: \"kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876864 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876895 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gfx5x\" (UniqueName: \"kubernetes.io/projected/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-kube-api-access-gfx5x\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876934 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876965 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjvm\" (UniqueName: \"kubernetes.io/projected/7251f858-d281-4566-a08d-e181cccf0542-kube-api-access-mbjvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.876997 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9mvx\" (UniqueName: \"kubernetes.io/projected/b93f3c26-69a5-4220-956a-2f6bbf884c9a-kube-api-access-c9mvx\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877019 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e02d43f-9df8-4896-b407-23fc963d956b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: 
\"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877046 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877260 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877332 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/093acc63-fa82-4f09-a668-95aea75f0352-signing-key\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 
09:45:36.877355 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877381 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccn72\" (UniqueName: \"kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877405 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-srv-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877449 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-config\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877498 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877544 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877579 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-config\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877606 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796s6\" (UniqueName: \"kubernetes.io/projected/e53bb259-c96c-4369-bc00-afbbf9a13eef-kube-api-access-796s6\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877652 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877678 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxdpf\" (UniqueName: \"kubernetes.io/projected/130e5354-c1dd-4314-a44a-21e77603424e-kube-api-access-kxdpf\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.877723 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889267 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889396 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889433 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889466 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50514b1c-335c-4d26-8ce2-a918e0d262da-config\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889496 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29b9k\" (UniqueName: \"kubernetes.io/projected/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-kube-api-access-29b9k\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889767 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e02d43f-9df8-4896-b407-23fc963d956b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889800 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f47486b-dcdc-457c-b6d1-2956bc3155f0-metrics-tls\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889852 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqvf\" (UniqueName: 
\"kubernetes.io/projected/cfd213f1-47b0-4488-ae7f-9e76c7538733-kube-api-access-5jqvf\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889930 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-socket-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889964 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518ef04-5379-442a-89d6-f0ab745e35c6-proxy-tls\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.889994 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/093acc63-fa82-4f09-a668-95aea75f0352-signing-cabundle\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890023 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqq6\" (UniqueName: \"kubernetes.io/projected/093acc63-fa82-4f09-a668-95aea75f0352-kube-api-access-lzqq6\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890068 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890100 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e8c510-4ff7-4758-b452-bdcc3eb6023b-config\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890136 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890161 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-metrics-certs\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890190 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-mountpoint-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " 
pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890219 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890336 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chvv6\" (UniqueName: \"kubernetes.io/projected/3f47486b-dcdc-457c-b6d1-2956bc3155f0-kube-api-access-chvv6\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890594 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b93f3c26-69a5-4220-956a-2f6bbf884c9a-service-ca-bundle\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.890878 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.891306 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.891383 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e8c510-4ff7-4758-b452-bdcc3eb6023b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.891841 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6e6b23-3c66-42a4-8f01-c8605efe1412-serving-cert\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.894221 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.897736 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.898593 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1469c562-1020-4d06-8018-6fd61392d855-proxy-tls\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.898724 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-default-certificate\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.902581 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.904192 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.904609 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.904875 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:45:36 crc kubenswrapper[4990]: E1003 09:45:36.905580 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.405557186 +0000 UTC m=+119.202189043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.908801 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a518ef04-5379-442a-89d6-f0ab745e35c6-proxy-tls\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.911292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-stats-auth\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.916529 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6e6b23-3c66-42a4-8f01-c8605efe1412-config\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.920914 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-config\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.922312 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e02d43f-9df8-4896-b407-23fc963d956b-trusted-ca\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.922680 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e02d43f-9df8-4896-b407-23fc963d956b-metrics-tls\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.923350 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.924586 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.930275 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e8c510-4ff7-4758-b452-bdcc3eb6023b-config\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: 
\"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.931599 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.932468 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b93f3c26-69a5-4220-956a-2f6bbf884c9a-metrics-certs\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.943217 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4qn\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.945647 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.945837 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.945875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.946159 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.946159 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.946949 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.951417 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.955732 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89m68\" (UniqueName: \"kubernetes.io/projected/509b877d-1d79-4882-96f7-725d81a000cd-kube-api-access-89m68\") pod \"migrator-59844c95c7-29mz5\" (UID: \"509b877d-1d79-4882-96f7-725d81a000cd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.973276 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlf8\" (UniqueName: \"kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8\") pod \"oauth-openshift-558db77b4-wnxdn\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.992862 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.993671 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.993921 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50514b1c-335c-4d26-8ce2-a918e0d262da-config\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.993946 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29b9k\" (UniqueName: \"kubernetes.io/projected/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-kube-api-access-29b9k\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.993981 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f47486b-dcdc-457c-b6d1-2956bc3155f0-metrics-tls\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994001 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqvf\" (UniqueName: \"kubernetes.io/projected/cfd213f1-47b0-4488-ae7f-9e76c7538733-kube-api-access-5jqvf\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: 
\"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994039 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-socket-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994059 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/093acc63-fa82-4f09-a668-95aea75f0352-signing-cabundle\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994077 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzqq6\" (UniqueName: \"kubernetes.io/projected/093acc63-fa82-4f09-a668-95aea75f0352-kube-api-access-lzqq6\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994121 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-mountpoint-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994156 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chvv6\" (UniqueName: \"kubernetes.io/projected/3f47486b-dcdc-457c-b6d1-2956bc3155f0-kube-api-access-chvv6\") pod \"dns-default-gsvpm\" 
(UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994181 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcjtc\" (UniqueName: \"kubernetes.io/projected/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-kube-api-access-vcjtc\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994206 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994229 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-registration-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994258 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-webhook-cert\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994281 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-srv-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994302 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-certs\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994326 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95h4f\" (UniqueName: \"kubernetes.io/projected/50514b1c-335c-4d26-8ce2-a918e0d262da-kube-api-access-95h4f\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994343 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7251f858-d281-4566-a08d-e181cccf0542-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994360 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc 
kubenswrapper[4990]: I1003 09:45:36.994388 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-node-bootstrap-token\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994405 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f47486b-dcdc-457c-b6d1-2956bc3155f0-config-volume\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994427 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7251f858-d281-4566-a08d-e181cccf0542-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994443 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/130e5354-c1dd-4314-a44a-21e77603424e-tmpfs\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994484 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xqf\" (UniqueName: \"kubernetes.io/projected/92e83dbe-2e24-432d-9a97-1a15a829b009-kube-api-access-h4xqf\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " 
pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994530 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994556 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994576 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-csi-data-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994606 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50514b1c-335c-4d26-8ce2-a918e0d262da-serving-cert\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc9n5\" (UniqueName: 
\"kubernetes.io/projected/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-kube-api-access-jc9n5\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994663 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-plugins-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994699 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-cert\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994739 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wlh\" (UniqueName: \"kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994764 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfx5x\" (UniqueName: \"kubernetes.io/projected/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-kube-api-access-gfx5x\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994787 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mbjvm\" (UniqueName: \"kubernetes.io/projected/7251f858-d281-4566-a08d-e181cccf0542-kube-api-access-mbjvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994810 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994826 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994914 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-srv-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994940 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/093acc63-fa82-4f09-a668-95aea75f0352-signing-key\") pod 
\"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994961 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994978 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccn72\" (UniqueName: \"kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.994998 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796s6\" (UniqueName: \"kubernetes.io/projected/e53bb259-c96c-4369-bc00-afbbf9a13eef-kube-api-access-796s6\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.995019 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxdpf\" (UniqueName: \"kubernetes.io/projected/130e5354-c1dd-4314-a44a-21e77603424e-kube-api-access-kxdpf\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.995041 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.995070 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-apiservice-cert\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.995941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7251f858-d281-4566-a08d-e181cccf0542-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:36 crc kubenswrapper[4990]: E1003 09:45:36.996030 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.496011842 +0000 UTC m=+119.292643699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:36 crc kubenswrapper[4990]: I1003 09:45:36.996487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50514b1c-335c-4d26-8ce2-a918e0d262da-config\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.000109 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-apiservice-cert\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.000422 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/130e5354-c1dd-4314-a44a-21e77603424e-tmpfs\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.001011 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-registration-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " 
pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.001230 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-socket-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.001248 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f47486b-dcdc-457c-b6d1-2956bc3155f0-metrics-tls\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.002589 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/093acc63-fa82-4f09-a668-95aea75f0352-signing-cabundle\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.002785 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-mountpoint-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005009 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005118 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/130e5354-c1dd-4314-a44a-21e77603424e-webhook-cert\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005250 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005397 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005562 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005650 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-plugins-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.005952 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/92e83dbe-2e24-432d-9a97-1a15a829b009-csi-data-dir\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.008941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f47486b-dcdc-457c-b6d1-2956bc3155f0-config-volume\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.010223 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.011785 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-profile-collector-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.012051 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/093acc63-fa82-4f09-a668-95aea75f0352-signing-key\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.012361 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-cert\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.012620 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.012686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e53bb259-c96c-4369-bc00-afbbf9a13eef-srv-cert\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.012741 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7251f858-d281-4566-a08d-e181cccf0542-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.013045 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgmwj\" (UniqueName: \"kubernetes.io/projected/a518ef04-5379-442a-89d6-f0ab745e35c6-kube-api-access-vgmwj\") pod \"machine-config-operator-74547568cd-4dklv\" (UID: \"a518ef04-5379-442a-89d6-f0ab745e35c6\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.013317 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.014431 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cfd213f1-47b0-4488-ae7f-9e76c7538733-srv-cert\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.014696 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.015172 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50514b1c-335c-4d26-8ce2-a918e0d262da-serving-cert\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.016525 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-node-bootstrap-token\") pod 
\"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.027059 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcm2\" (UniqueName: \"kubernetes.io/projected/3a6e6b23-3c66-42a4-8f01-c8605efe1412-kube-api-access-dfcm2\") pod \"console-operator-58897d9998-zr9m5\" (UID: \"3a6e6b23-3c66-42a4-8f01-c8605efe1412\") " pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.027294 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-certs\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.039488 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7dn6\" (UniqueName: \"kubernetes.io/projected/1469c562-1020-4d06-8018-6fd61392d855-kube-api-access-t7dn6\") pod \"machine-config-controller-84d6567774-7qwq7\" (UID: \"1469c562-1020-4d06-8018-6fd61392d855\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.059616 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e8c510-4ff7-4758-b452-bdcc3eb6023b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rr74z\" (UID: \"01e8c510-4ff7-4758-b452-bdcc3eb6023b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.072501 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mmp9d\" (UID: \"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.081750 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qn5kr"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.097005 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.097813 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.597795853 +0000 UTC m=+119.394427710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.099117 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vdkw\" (UniqueName: \"kubernetes.io/projected/9e02d43f-9df8-4896-b407-23fc963d956b-kube-api-access-2vdkw\") pod \"ingress-operator-5b745b69d9-l96ff\" (UID: \"9e02d43f-9df8-4896-b407-23fc963d956b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.118973 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9mvx\" (UniqueName: \"kubernetes.io/projected/b93f3c26-69a5-4220-956a-2f6bbf884c9a-kube-api-access-c9mvx\") pod \"router-default-5444994796-hdl6b\" (UID: \"b93f3c26-69a5-4220-956a-2f6bbf884c9a\") " pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.156544 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: W1003 09:45:37.172810 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ebd5bf8_15f2_4538_b8ad_08504265e855.slice/crio-442b1f91879937eb68af489d1404ecc9eb63cf9d7adb4ad6605b721f9b1f9c01 
WatchSource:0}: Error finding container 442b1f91879937eb68af489d1404ecc9eb63cf9d7adb4ad6605b721f9b1f9c01: Status 404 returned error can't find the container with id 442b1f91879937eb68af489d1404ecc9eb63cf9d7adb4ad6605b721f9b1f9c01 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.175702 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29b9k\" (UniqueName: \"kubernetes.io/projected/b824bdb2-af49-45eb-9ee4-d74f8f5461fd-kube-api-access-29b9k\") pod \"ingress-canary-sgmrx\" (UID: \"b824bdb2-af49-45eb-9ee4-d74f8f5461fd\") " pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.186610 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xqf\" (UniqueName: \"kubernetes.io/projected/92e83dbe-2e24-432d-9a97-1a15a829b009-kube-api-access-h4xqf\") pod \"csi-hostpathplugin-69dts\" (UID: \"92e83dbe-2e24-432d-9a97-1a15a829b009\") " pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.198272 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.198722 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.69870443 +0000 UTC m=+119.495336287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.203070 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.211080 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.212920 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqvf\" (UniqueName: \"kubernetes.io/projected/cfd213f1-47b0-4488-ae7f-9e76c7538733-kube-api-access-5jqvf\") pod \"olm-operator-6b444d44fb-6vmzk\" (UID: \"cfd213f1-47b0-4488-ae7f-9e76c7538733\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.233416 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqq6\" (UniqueName: \"kubernetes.io/projected/093acc63-fa82-4f09-a668-95aea75f0352-kube-api-access-lzqq6\") pod \"service-ca-9c57cc56f-rwrlt\" (UID: \"093acc63-fa82-4f09-a668-95aea75f0352\") " pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.233485 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.239264 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.253384 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chvv6\" (UniqueName: \"kubernetes.io/projected/3f47486b-dcdc-457c-b6d1-2956bc3155f0-kube-api-access-chvv6\") pod \"dns-default-gsvpm\" (UID: \"3f47486b-dcdc-457c-b6d1-2956bc3155f0\") " pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.253816 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.255161 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-llj5c"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.271310 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.276792 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.278931 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcjtc\" (UniqueName: \"kubernetes.io/projected/dc59e860-cd3f-40b8-bc9a-a26ae53e89d6-kube-api-access-vcjtc\") pod \"package-server-manager-789f6589d5-x692t\" (UID: \"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.283416 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.287368 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qvmfh"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.290541 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbjvm\" (UniqueName: \"kubernetes.io/projected/7251f858-d281-4566-a08d-e181cccf0542-kube-api-access-mbjvm\") pod \"kube-storage-version-migrator-operator-b67b599dd-skjmc\" (UID: \"7251f858-d281-4566-a08d-e181cccf0542\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.300595 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.301404 4990 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.801379693 +0000 UTC m=+119.598011560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.308496 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.313603 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc9n5\" (UniqueName: \"kubernetes.io/projected/1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747-kube-api-access-jc9n5\") pod \"machine-config-server-flvhx\" (UID: \"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747\") " pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.323106 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.329850 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wlh\" (UniqueName: \"kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh\") pod \"marketplace-operator-79b997595-prdsm\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.342183 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.352314 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfx5x\" (UniqueName: \"kubernetes.io/projected/a42cdb61-90ff-40c9-8b18-95b86f1dfc3a-kube-api-access-gfx5x\") pod \"control-plane-machine-set-operator-78cbb6b69f-r8p2v\" (UID: \"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.355545 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.377966 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccn72\" (UniqueName: \"kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72\") pod \"collect-profiles-29324745-jtl78\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.392291 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95h4f\" (UniqueName: \"kubernetes.io/projected/50514b1c-335c-4d26-8ce2-a918e0d262da-kube-api-access-95h4f\") pod \"service-ca-operator-777779d784-gtsmk\" (UID: \"50514b1c-335c-4d26-8ce2-a918e0d262da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.394708 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.406037 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.406160 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.406625 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:37.906609033 +0000 UTC m=+119.703240890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.412324 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.422153 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-flvhx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.433426 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.437878 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796s6\" (UniqueName: \"kubernetes.io/projected/e53bb259-c96c-4369-bc00-afbbf9a13eef-kube-api-access-796s6\") pod \"catalog-operator-68c6474976-6mnsf\" (UID: \"e53bb259-c96c-4369-bc00-afbbf9a13eef\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.455971 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxdpf\" (UniqueName: \"kubernetes.io/projected/130e5354-c1dd-4314-a44a-21e77603424e-kube-api-access-kxdpf\") pod \"packageserver-d55dfcdfc-h2n67\" (UID: \"130e5354-c1dd-4314-a44a-21e77603424e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.459780 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sgmrx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.474220 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-69dts" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.512708 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.513149 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.013129057 +0000 UTC m=+119.809760914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.545926 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.605931 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.617794 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.620710 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.621132 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.121116689 +0000 UTC m=+119.917748546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.632803 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.633681 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" event={"ID":"e4fbb0a8-9cb0-49b1-9e81-77c291382938","Type":"ContainerStarted","Data":"278f2b9b88279fca33a555da879aeb21831e7105a42fcac8280ef8104dfca704"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.640353 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" event={"ID":"8637f893-27db-425a-85aa-8d8d5c6e1dba","Type":"ContainerStarted","Data":"2426818a5960f89c9cd37544e30aca7aa848939254a0b2e118cee162c6f75949"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.640418 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" event={"ID":"8637f893-27db-425a-85aa-8d8d5c6e1dba","Type":"ContainerStarted","Data":"1757b0f17f655169467794dd13af00cca023e29b9bd8acba59047837300c6602"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.640428 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" event={"ID":"8637f893-27db-425a-85aa-8d8d5c6e1dba","Type":"ContainerStarted","Data":"aa58b841eba59177d7483b20ba44efac9749b0b67972df08655704f9d8404262"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.653871 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" event={"ID":"7e3d6d60-b354-4999-a205-80a71688caec","Type":"ContainerStarted","Data":"cee26b83a031e00252ee23a6e6d7cc5b66a576c524db357c13b023b9bb550780"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.653949 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" event={"ID":"7e3d6d60-b354-4999-a205-80a71688caec","Type":"ContainerStarted","Data":"26e2b8a989c63e4224b789d59de90a40cf5cf96ef07866d60386a8ca3e2bbacb"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.654030 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.657425 4990 generic.go:334] "Generic (PLEG): container finished" podID="f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6" containerID="5ef3bcb33fc5e7dffeb93069880339ab50bab3ce4681a0c78391cfed60a61e3c" exitCode=0 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.657778 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" event={"ID":"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6","Type":"ContainerDied","Data":"5ef3bcb33fc5e7dffeb93069880339ab50bab3ce4681a0c78391cfed60a61e3c"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.665966 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" event={"ID":"9680507e-96dc-43bd-ade2-61985d923ddc","Type":"ContainerStarted","Data":"0a68b4d70c80270b749710cc72871125ef2b821effc846141e87ce893cb243f7"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.666010 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" event={"ID":"9680507e-96dc-43bd-ade2-61985d923ddc","Type":"ContainerStarted","Data":"0d2008e0b99ed5b72ef874bf97f97bbedf2b852018833305510da9e776649ff7"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.666718 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" 
event={"ID":"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2","Type":"ContainerStarted","Data":"f9a5f92f2d917dfbea48a31fb8c614108d933de4696f8b7162926e47a9c648cf"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.669296 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" event={"ID":"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5","Type":"ContainerStarted","Data":"499215ab2870f03e5292fc673a81f3fc6eaf259aada1ef73b696b078055351dd"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.669324 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" event={"ID":"53edcb9e-1d3f-48bf-a457-e1f3ec65bba5","Type":"ContainerStarted","Data":"009ebeac5a0598149dd2c10707b13b35acd9096c5f373c0ca008b0457b0b50f4"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.673977 4990 generic.go:334] "Generic (PLEG): container finished" podID="650ac140-20eb-4503-bcf2-f9795cebaff2" containerID="99afb063a4229f9c1f3e43e84328df08c68c743d86246a62b8fef36837203e52" exitCode=0 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.674120 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" event={"ID":"650ac140-20eb-4503-bcf2-f9795cebaff2","Type":"ContainerDied","Data":"99afb063a4229f9c1f3e43e84328df08c68c743d86246a62b8fef36837203e52"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.679952 4990 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5gscv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.680026 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" 
podUID="7e3d6d60-b354-4999-a205-80a71688caec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.682622 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.691736 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" event={"ID":"bc74c56c-b4ba-479b-87ba-ba707c62af66","Type":"ContainerStarted","Data":"ab3a8cbfd9502e38cd94942137e4678b14209ebba98713e24a82ebc053ee6eab"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.692386 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" event={"ID":"bc74c56c-b4ba-479b-87ba-ba707c62af66","Type":"ContainerStarted","Data":"3e305c89ce9c0e720b3b82d2f84dfb893af896910a6a6a9b7803f736ccf9bb50"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.696378 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" event={"ID":"554eeba0-898e-4fd2-9308-2e3028b4e8ae","Type":"ContainerStarted","Data":"613ba0d3539f2f3119eb9065ed8f66b25375493fcc8170d5853482c7456f615e"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.698595 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f5nfx" event={"ID":"60f1ddd8-ebdb-4575-b06e-619cbe196937","Type":"ContainerStarted","Data":"3405ca64013c864f3f27fad0264e81c5ff88e465d8cf6410b679b57af834ec47"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.698617 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f5nfx" 
event={"ID":"60f1ddd8-ebdb-4575-b06e-619cbe196937","Type":"ContainerStarted","Data":"7171fef666b60c89a9bbe48236fc2b11c61eccdbe64a45d110809d08e3cf67d8"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.700047 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.715346 4990 patch_prober.go:28] interesting pod/downloads-7954f5f757-f5nfx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.715487 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f5nfx" podUID="60f1ddd8-ebdb-4575-b06e-619cbe196937" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.716633 4990 generic.go:334] "Generic (PLEG): container finished" podID="4a6095bd-ea6a-4e7b-8504-89aa4704a720" containerID="21748c43a60bc3783e6a93a6d71c5290211d9346ca5353543fe392f720ce0b60" exitCode=0 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.719317 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" event={"ID":"4a6095bd-ea6a-4e7b-8504-89aa4704a720","Type":"ContainerDied","Data":"21748c43a60bc3783e6a93a6d71c5290211d9346ca5353543fe392f720ce0b60"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.722375 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.722805 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.222777057 +0000 UTC m=+120.019408914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.788092 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" event={"ID":"85165410-2d0a-4af1-a60f-a1374e58c3f1","Type":"ContainerStarted","Data":"102ace9e4a5ca97149854f5263854fbeba04d69ade5a62c44e984fe8311efd27"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.818195 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" event={"ID":"4ebd5bf8-15f2-4538-b8ad-08504265e855","Type":"ContainerStarted","Data":"442b1f91879937eb68af489d1404ecc9eb63cf9d7adb4ad6605b721f9b1f9c01"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.823331 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:37 crc 
kubenswrapper[4990]: E1003 09:45:37.824941 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.324918517 +0000 UTC m=+120.121550374 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.850678 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kftbs" podStartSLOduration=98.850657747 podStartE2EDuration="1m38.850657747s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:37.849533668 +0000 UTC m=+119.646165525" watchObservedRunningTime="2025-10-03 09:45:37.850657747 +0000 UTC m=+119.647289604" Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.853625 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" event={"ID":"0450636d-e918-475a-af84-690cac4baa47","Type":"ContainerStarted","Data":"dc1468edacc205d487a893b5295e4625e0a7b8bf787fed27a291bbca0aed8604"} Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.854378 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:37 crc kubenswrapper[4990]: W1003 09:45:37.870547 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509b877d_1d79_4882_96f7_725d81a000cd.slice/crio-7ab285836238a417e6e83870c6cb8f90894e8fbcaa36bc472931a8290bb66725 WatchSource:0}: Error finding container 7ab285836238a417e6e83870c6cb8f90894e8fbcaa36bc472931a8290bb66725: Status 404 returned error can't find the container with id 7ab285836238a417e6e83870c6cb8f90894e8fbcaa36bc472931a8290bb66725 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.876280 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.879315 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.894203 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5jgtm" podStartSLOduration=98.89417762 podStartE2EDuration="1m38.89417762s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:37.890763911 +0000 UTC m=+119.687395768" watchObservedRunningTime="2025-10-03 09:45:37.89417762 +0000 UTC m=+119.690809477" Oct 03 09:45:37 crc kubenswrapper[4990]: W1003 09:45:37.915851 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1469c562_1020_4d06_8018_6fd61392d855.slice/crio-db55cf35ad2691844e177d443616da10143cc120df0ba098f9c4c7339d4ee999 WatchSource:0}: Error finding container db55cf35ad2691844e177d443616da10143cc120df0ba098f9c4c7339d4ee999: Status 
404 returned error can't find the container with id db55cf35ad2691844e177d443616da10143cc120df0ba098f9c4c7339d4ee999 Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.927088 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:37 crc kubenswrapper[4990]: E1003 09:45:37.930220 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.430201348 +0000 UTC m=+120.226833205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:37 crc kubenswrapper[4990]: I1003 09:45:37.951477 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zr9m5"] Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.035090 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:38 crc 
kubenswrapper[4990]: E1003 09:45:38.035401 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.535385277 +0000 UTC m=+120.332017134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.096775 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z"] Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.130326 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff"] Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.145681 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.146161 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 09:45:38.646146622 +0000 UTC m=+120.442778489 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.216842 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2br82" podStartSLOduration=99.216823642 podStartE2EDuration="1m39.216823642s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:38.183496284 +0000 UTC m=+119.980128141" watchObservedRunningTime="2025-10-03 09:45:38.216823642 +0000 UTC m=+120.013455499" Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.223339 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv"] Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.250077 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.250471 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.750452728 +0000 UTC m=+120.547084585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.336234 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.352433 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.353097 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.85304749 +0000 UTC m=+120.649679347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.456256 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.456648 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:38.956631227 +0000 UTC m=+120.753263084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.559004 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.560718 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.060699787 +0000 UTC m=+120.857331634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.660565 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.660963 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.160947738 +0000 UTC m=+120.957579595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.771210 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.771644 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.27163202 +0000 UTC m=+121.068263877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.900088 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.900483 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.400462955 +0000 UTC m=+121.197094812 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.900837 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:38 crc kubenswrapper[4990]: E1003 09:45:38.901149 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.401140723 +0000 UTC m=+121.197772580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.903778 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9k9dv" podStartSLOduration=99.903752021 podStartE2EDuration="1m39.903752021s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:38.852422134 +0000 UTC m=+120.649053981" watchObservedRunningTime="2025-10-03 09:45:38.903752021 +0000 UTC m=+120.700383878" Oct 03 09:45:38 crc kubenswrapper[4990]: I1003 09:45:38.983492 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f5nfx" podStartSLOduration=99.983472867 podStartE2EDuration="1m39.983472867s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:38.911894083 +0000 UTC m=+120.708525940" watchObservedRunningTime="2025-10-03 09:45:38.983472867 +0000 UTC m=+120.780104724" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.009122 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.010063 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.510029428 +0000 UTC m=+121.306661285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.023963 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hdl6b" event={"ID":"b93f3c26-69a5-4220-956a-2f6bbf884c9a","Type":"ContainerStarted","Data":"0779226bb6335c9e2fc586472605b7ec9b3fd39e406a4926ab70ec157dffd40f"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.046995 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" event={"ID":"4ebd5bf8-15f2-4538-b8ad-08504265e855","Type":"ContainerStarted","Data":"a7567dc574d949069f1d014cf7d89ec613549d09e8b6d958cbb8573885322f57"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.052734 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" podStartSLOduration=100.052705259 podStartE2EDuration="1m40.052705259s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 09:45:39.021397994 +0000 UTC m=+120.818029851" watchObservedRunningTime="2025-10-03 09:45:39.052705259 +0000 UTC m=+120.849337116" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.063816 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" event={"ID":"a518ef04-5379-442a-89d6-f0ab745e35c6","Type":"ContainerStarted","Data":"475fec590949bbf01f6feb016229d53aa5235f80cdd68569b073657fff572d05"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.077316 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" event={"ID":"3a6e6b23-3c66-42a4-8f01-c8605efe1412","Type":"ContainerStarted","Data":"b148ec517441e86cc4a24950a92906d743bd1051cd13613057ff44b5e5279245"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.089196 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7xjqr" podStartSLOduration=100.089170139 podStartE2EDuration="1m40.089170139s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.077030453 +0000 UTC m=+120.873662320" watchObservedRunningTime="2025-10-03 09:45:39.089170139 +0000 UTC m=+120.885801996" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.112052 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" event={"ID":"cd1a02a7-9b7a-417f-a2b7-7421705a3010","Type":"ContainerStarted","Data":"3c352f16110b7055b4cb4b9f88c5322946fe8fe57930b892de070fc7e816b869"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.112877 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.113187 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.613174494 +0000 UTC m=+121.409806351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.132187 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5n7fm" podStartSLOduration=100.132168469 podStartE2EDuration="1m40.132168469s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.106438129 +0000 UTC m=+120.903069996" watchObservedRunningTime="2025-10-03 09:45:39.132168469 +0000 UTC m=+120.928800326" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.132270 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" 
event={"ID":"509b877d-1d79-4882-96f7-725d81a000cd","Type":"ContainerStarted","Data":"7ab285836238a417e6e83870c6cb8f90894e8fbcaa36bc472931a8290bb66725"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.149775 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" event={"ID":"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1","Type":"ContainerStarted","Data":"518489559fbf76f61e302e33cf6df8a2d3832cd3f927b081dd79ced32f483883"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.151884 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.163558 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.216158 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.216459 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.716444813 +0000 UTC m=+121.513076670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.219988 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" event={"ID":"e4fbb0a8-9cb0-49b1-9e81-77c291382938","Type":"ContainerStarted","Data":"1b93e2ce45869a49d89f3b19ba8da0cd99f221d9d91374e14885261151587c98"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.237907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" event={"ID":"85165410-2d0a-4af1-a60f-a1374e58c3f1","Type":"ContainerStarted","Data":"0dc366065886b4bb7ae523594946110c1403867bf637bf829d090f1020a85587"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.280340 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" event={"ID":"9e02d43f-9df8-4896-b407-23fc963d956b","Type":"ContainerStarted","Data":"351fe52d307b469e787ff034f8894297e6955b4794ff59f3f633c1aad8284b95"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.285082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" event={"ID":"1469c562-1020-4d06-8018-6fd61392d855","Type":"ContainerStarted","Data":"db55cf35ad2691844e177d443616da10143cc120df0ba098f9c4c7339d4ee999"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.310364 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-flvhx" event={"ID":"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747","Type":"ContainerStarted","Data":"6875f5f1b0c825834b47dad6abe1919f491109d86984ad71291d9b3dd4eb1c20"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.319642 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.329824 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.829804255 +0000 UTC m=+121.626436112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.330325 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" event={"ID":"01e8c510-4ff7-4758-b452-bdcc3eb6023b","Type":"ContainerStarted","Data":"d5253c99cbfb45283fcf06261ad1f7ff22b7dac64b1e0d4a9c923d06178b6596"} Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.330392 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.333382 4990 patch_prober.go:28] interesting pod/downloads-7954f5f757-f5nfx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.333434 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f5nfx" podUID="60f1ddd8-ebdb-4575-b06e-619cbe196937" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.354430 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t"] Oct 03 09:45:39 crc kubenswrapper[4990]: 
I1003 09:45:39.365803 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rwrlt"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.387022 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.400554 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gsvpm"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.429569 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.432060 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:39.932038618 +0000 UTC m=+121.728670475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.503368 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-69dts"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.533437 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.535176 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.536104 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.036089467 +0000 UTC m=+121.832721314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.551956 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sgmrx"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.564148 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.571810 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-85xqf" podStartSLOduration=100.571781427 podStartE2EDuration="1m40.571781427s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.510255194 +0000 UTC m=+121.306887061" watchObservedRunningTime="2025-10-03 09:45:39.571781427 +0000 UTC m=+121.368413284" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.611765 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.635999 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hqwkx" podStartSLOduration=101.635980318 podStartE2EDuration="1m41.635980318s" podCreationTimestamp="2025-10-03 09:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.63489593 +0000 UTC m=+121.431527797" watchObservedRunningTime="2025-10-03 09:45:39.635980318 +0000 UTC m=+121.432612175" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.637991 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.638400 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.138384511 +0000 UTC m=+121.935016368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.644745 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.679115 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78"] Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.744690 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.745053 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.245036988 +0000 UTC m=+122.041668845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.772635 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m8h8" podStartSLOduration=100.772619567 podStartE2EDuration="1m40.772619567s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.690916269 +0000 UTC m=+121.487548126" watchObservedRunningTime="2025-10-03 09:45:39.772619567 +0000 UTC m=+121.569251424" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.821257 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" podStartSLOduration=100.821228082 podStartE2EDuration="1m40.821228082s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:39.81960689 +0000 UTC m=+121.616238757" watchObservedRunningTime="2025-10-03 09:45:39.821228082 +0000 UTC m=+121.617859939" Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.846135 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.846445 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.346427789 +0000 UTC m=+122.143059646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:39 crc kubenswrapper[4990]: I1003 09:45:39.950651 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:39 crc kubenswrapper[4990]: E1003 09:45:39.951340 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.45132202 +0000 UTC m=+122.247953877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.059028 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.059244 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.55920917 +0000 UTC m=+122.355841027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.059743 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.060144 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.560128154 +0000 UTC m=+122.356760011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.160895 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.162019 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.661992946 +0000 UTC m=+122.458624813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.162087 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.163612 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.663597188 +0000 UTC m=+122.460229045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.266109 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.266684 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.766667202 +0000 UTC m=+122.563299059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.344498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" event={"ID":"0c77dc95-2e7f-47c7-821d-c94a13f18f32","Type":"ContainerStarted","Data":"f49dfab5ba6806841e4b6439652ecdcee8c5b84e11b9ffc312dd8ada3f6f8ff9"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.352392 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" event={"ID":"a9fc8919-5fe8-4b93-b80e-f1ce53ef74c1","Type":"ContainerStarted","Data":"621313a6c52cd9ff604f310d9cb18e6e5d2efce43b3e403be48626a2157dcd50"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.355290 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69dts" event={"ID":"92e83dbe-2e24-432d-9a97-1a15a829b009","Type":"ContainerStarted","Data":"413974827f1dc8d3bd791335c37e0f59583b265613d91e4ff3f83db129ac5f7b"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.362732 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" event={"ID":"a518ef04-5379-442a-89d6-f0ab745e35c6","Type":"ContainerStarted","Data":"40292eb93477dc22769563d7d072cd2bf904b87374c1ca2236fad030b98ee19a"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.367618 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.368105 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.868068293 +0000 UTC m=+122.664700180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.371116 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" event={"ID":"9e02d43f-9df8-4896-b407-23fc963d956b","Type":"ContainerStarted","Data":"af0d23ac6fcbb8dafadc19d22b79443f195a4a1d84e6f95658e617fb741e3750"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.374382 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" event={"ID":"d1fb0b61-25d2-425b-9b4f-3d2dd9b339d2","Type":"ContainerStarted","Data":"efb80ab0d8afac51f2e3f080699ad909140a928c66629633ecc7538141906eb9"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.382602 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sgmrx" 
event={"ID":"b824bdb2-af49-45eb-9ee4-d74f8f5461fd","Type":"ContainerStarted","Data":"e3b325e8af7f3997133f9cfe874b8d08bc22fb7fe9bdd910dae398eb8ff34566"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.389769 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" event={"ID":"650ac140-20eb-4503-bcf2-f9795cebaff2","Type":"ContainerStarted","Data":"a508f1c450a9c163bb33f4160962c31a39ad9e94bd6d233e071ba374e9a49dc5"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.392459 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" event={"ID":"cd1a02a7-9b7a-417f-a2b7-7421705a3010","Type":"ContainerStarted","Data":"8c818eaef9847549849faf7b8076d9593ca5c83807568baafdd3e8d601b05c2b"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.393207 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.395477 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" event={"ID":"e53bb259-c96c-4369-bc00-afbbf9a13eef","Type":"ContainerStarted","Data":"d75a8ef78049d0fcba34a080417fa1995f8a201cad54a56989358a85a5d0695b"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.399643 4990 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wnxdn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.399690 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerName="oauth-openshift" probeResult="failure" 
output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.413193 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" event={"ID":"4a6095bd-ea6a-4e7b-8504-89aa4704a720","Type":"ContainerStarted","Data":"3e1d64a0c2f7ace54eef2d7740490744b3955e7fee370229d8fa1df7d53703cd"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.417347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" event={"ID":"f7b5f6d0-abed-4db0-87c4-52ee2afc7bd6","Type":"ContainerStarted","Data":"967ce71087d38afa1128b7e257734a41679c313dc9cf0feba3852d8c74e8d866"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.417382 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.422240 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mmp9d" podStartSLOduration=101.422215053 podStartE2EDuration="1m41.422215053s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.410662192 +0000 UTC m=+122.207294089" watchObservedRunningTime="2025-10-03 09:45:40.422215053 +0000 UTC m=+122.218846920" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.433791 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" event={"ID":"01e8c510-4ff7-4758-b452-bdcc3eb6023b","Type":"ContainerStarted","Data":"d5caf291d6faaace8559a75ca9184767707592b0e01a03297edb0d137c0caed0"} Oct 03 09:45:40 crc kubenswrapper[4990]: 
I1003 09:45:40.469598 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.470795 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:40.970780148 +0000 UTC m=+122.767412005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.471858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsvpm" event={"ID":"3f47486b-dcdc-457c-b6d1-2956bc3155f0","Type":"ContainerStarted","Data":"75dbd19fb2b6b7a0ba757ca3355bf7accfdc28116a46d50d936a189400056104"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.478307 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" event={"ID":"50514b1c-335c-4d26-8ce2-a918e0d262da","Type":"ContainerStarted","Data":"bd3100847a31ce9c2c435541e62869afad6f580c4a0846a3af9c8fe67166c04e"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.478366 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" event={"ID":"50514b1c-335c-4d26-8ce2-a918e0d262da","Type":"ContainerStarted","Data":"6667e796b11e5d865a667f680bc2e6e4f5b01bfc5b09bdcf12208e39b9689826"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.492560 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" event={"ID":"4ebd5bf8-15f2-4538-b8ad-08504265e855","Type":"ContainerStarted","Data":"5ddcd4def6f1191b1c149e0cb96515f6b7450faf73327165d410b304da85ed35"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.499344 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" event={"ID":"7251f858-d281-4566-a08d-e181cccf0542","Type":"ContainerStarted","Data":"b620ec57289e45650b2afbead217aad4cb6711f6d8dc43c9ebf0f14b1696d4c1"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.504948 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" event={"ID":"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6","Type":"ContainerStarted","Data":"96d434893cf70117a32b6c1d2f3b71503e015a4426d50dc25fcd8d3e2d88ce7b"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.512627 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qvmfh" podStartSLOduration=101.512604907 podStartE2EDuration="1m41.512604907s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.447961393 +0000 UTC m=+122.244593250" watchObservedRunningTime="2025-10-03 09:45:40.512604907 +0000 UTC m=+122.309236764" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.517238 4990 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" podStartSLOduration=101.517212157 podStartE2EDuration="1m41.517212157s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.515109342 +0000 UTC m=+122.311741209" watchObservedRunningTime="2025-10-03 09:45:40.517212157 +0000 UTC m=+122.313844014" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.518142 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-flvhx" event={"ID":"1abdbf5f-b06d-4cd1-86d3-0f5e6aba5747","Type":"ContainerStarted","Data":"b6f12eb4c6cfc545bf7426274b9b9e8c068ca3ec83c188f48e921d6eb3f62b38"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.556053 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hdl6b" event={"ID":"b93f3c26-69a5-4220-956a-2f6bbf884c9a","Type":"ContainerStarted","Data":"be0035f8126d72b957b65dd2670debe7493d6c35309127060bffda2d0f2199c3"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.575642 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.577819 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.077801284 +0000 UTC m=+122.874433241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.604853 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" podStartSLOduration=101.604825938 podStartE2EDuration="1m41.604825938s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.599087119 +0000 UTC m=+122.395718996" watchObservedRunningTime="2025-10-03 09:45:40.604825938 +0000 UTC m=+122.401457805" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.643672 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rr74z" podStartSLOduration=101.643639179 podStartE2EDuration="1m41.643639179s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.635641691 +0000 UTC m=+122.432273558" watchObservedRunningTime="2025-10-03 09:45:40.643639179 +0000 UTC m=+122.440271036" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.677286 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.677397 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.177375197 +0000 UTC m=+122.974007064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.678121 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.678870 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.178655991 +0000 UTC m=+122.975287848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.689134 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" event={"ID":"509b877d-1d79-4882-96f7-725d81a000cd","Type":"ContainerStarted","Data":"bf90c6dc245ebce6983e6ab7ba8ed29f272003d5e545f1b555788002cfa1feab"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.693400 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qn5kr" podStartSLOduration=101.693376874 podStartE2EDuration="1m41.693376874s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.673678751 +0000 UTC m=+122.470310618" watchObservedRunningTime="2025-10-03 09:45:40.693376874 +0000 UTC m=+122.490008731" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.719620 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" event={"ID":"cfd213f1-47b0-4488-ae7f-9e76c7538733","Type":"ContainerStarted","Data":"e8bee5fcb63f045c4d92d26542aacb5460f06b6d0d1602a817d5e1aaa1b26475"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.719695 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" 
event={"ID":"cfd213f1-47b0-4488-ae7f-9e76c7538733","Type":"ContainerStarted","Data":"6572ca5b887bd6d4b5275a0028b077f22846958fac39df5f96fcf52bb61cb31b"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.721334 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.727815 4990 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6vmzk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.727889 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" podUID="cfd213f1-47b0-4488-ae7f-9e76c7538733" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.730919 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" event={"ID":"554eeba0-898e-4fd2-9308-2e3028b4e8ae","Type":"ContainerStarted","Data":"6012f030e53d38965d1c4bc3cbe9753b072b44eaa0375d908d9222a021487274"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.741282 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" podStartSLOduration=101.741257001 podStartE2EDuration="1m41.741257001s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.739398273 +0000 UTC m=+122.536030140" 
watchObservedRunningTime="2025-10-03 09:45:40.741257001 +0000 UTC m=+122.537888858" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.772671 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" event={"ID":"3a6e6b23-3c66-42a4-8f01-c8605efe1412","Type":"ContainerStarted","Data":"f05a4bab5c9947687419be091d150908bd5d5ef77fd16ba9b9d6cf2f092afd98"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.773985 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.787715 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.789141 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.289118247 +0000 UTC m=+123.085750104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.793811 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" event={"ID":"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a","Type":"ContainerStarted","Data":"ffd98f33dbb5a515111b5b5fb1975a1bd624f62923e5ab3b935769fa08c94e20"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.794817 4990 patch_prober.go:28] interesting pod/console-operator-58897d9998-zr9m5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.794885 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" podUID="3a6e6b23-3c66-42a4-8f01-c8605efe1412" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.807608 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" event={"ID":"093acc63-fa82-4f09-a668-95aea75f0352","Type":"ContainerStarted","Data":"4d344de27d8403ecb7530a578c6b78c4fe7437e90e329654969b2e3de2b880af"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.810894 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" event={"ID":"130e5354-c1dd-4314-a44a-21e77603424e","Type":"ContainerStarted","Data":"e483348ce2816eab12b942e3629dc97af9c996927fc362949d292d0dfc64d9ca"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.821370 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hdl6b" podStartSLOduration=101.821353597 podStartE2EDuration="1m41.821353597s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.81916342 +0000 UTC m=+122.615795277" watchObservedRunningTime="2025-10-03 09:45:40.821353597 +0000 UTC m=+122.617985454" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.838898 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" event={"ID":"1469c562-1020-4d06-8018-6fd61392d855","Type":"ContainerStarted","Data":"f68a301b75a1c440f157eda815332c05cb881c440858b9c853d24a64f8c3ba34"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.850319 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" event={"ID":"5bd5d267-6907-4d32-9620-dd6270e911f7","Type":"ContainerStarted","Data":"aa72d1f3a494eea8f66a21ee3f8a34a7d069cb4fe649c96606ac56dac4b84cb4"} Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.852640 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-flvhx" podStartSLOduration=6.852602651 podStartE2EDuration="6.852602651s" podCreationTimestamp="2025-10-03 09:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.851003799 +0000 UTC 
m=+122.647635656" watchObservedRunningTime="2025-10-03 09:45:40.852602651 +0000 UTC m=+122.649234498" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.854316 4990 patch_prober.go:28] interesting pod/downloads-7954f5f757-f5nfx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.854400 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f5nfx" podUID="60f1ddd8-ebdb-4575-b06e-619cbe196937" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.895402 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:40 crc kubenswrapper[4990]: E1003 09:45:40.896833 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.396810831 +0000 UTC m=+123.193442768 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.907271 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" podStartSLOduration=101.907249213 podStartE2EDuration="1m41.907249213s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.90290693 +0000 UTC m=+122.699538787" watchObservedRunningTime="2025-10-03 09:45:40.907249213 +0000 UTC m=+122.703881070" Oct 03 09:45:40 crc kubenswrapper[4990]: I1003 09:45:40.968848 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gtsmk" podStartSLOduration=101.968811006 podStartE2EDuration="1m41.968811006s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:40.968016165 +0000 UTC m=+122.764648042" watchObservedRunningTime="2025-10-03 09:45:40.968811006 +0000 UTC m=+122.765442873" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.001667 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.006753 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.506720153 +0000 UTC m=+123.303352010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.104446 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.104924 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.6048948 +0000 UTC m=+123.401526657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.126838 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.126879 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.153073 4990 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-tjqxd container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.153146 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" podUID="650ac140-20eb-4503-bcf2-f9795cebaff2" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.188563 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" podStartSLOduration=102.188541068 podStartE2EDuration="1m42.188541068s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 09:45:41.021068817 +0000 UTC m=+122.817700684" watchObservedRunningTime="2025-10-03 09:45:41.188541068 +0000 UTC m=+122.985172925" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.205192 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.205541 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.70552477 +0000 UTC m=+123.502156617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.237238 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" podStartSLOduration=102.237216915 podStartE2EDuration="1m42.237216915s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:41.236658041 +0000 UTC m=+123.033289918" watchObservedRunningTime="2025-10-03 09:45:41.237216915 +0000 UTC m=+123.033848772" Oct 03 
09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.239685 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" podStartSLOduration=102.23967776 podStartE2EDuration="1m42.23967776s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:41.189202825 +0000 UTC m=+122.985834692" watchObservedRunningTime="2025-10-03 09:45:41.23967776 +0000 UTC m=+123.036309617" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.254060 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.272441 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:41 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:41 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:41 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.272543 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.306544 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: 
\"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.307041 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.807023473 +0000 UTC m=+123.603655330 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.407705 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.408092 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:41.908071985 +0000 UTC m=+123.704703842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.509772 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.510190 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.010174094 +0000 UTC m=+123.806805951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.610300 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.610710 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.11066546 +0000 UTC m=+123.907297317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.610917 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.611209 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.111196994 +0000 UTC m=+123.907828851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.712376 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.712635 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.212604985 +0000 UTC m=+124.009236842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.712892 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.713312 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.213304413 +0000 UTC m=+124.009936270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.814403 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 09:45:41.814896 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.314878198 +0000 UTC m=+124.111510065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.883134 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sgmrx" event={"ID":"b824bdb2-af49-45eb-9ee4-d74f8f5461fd","Type":"ContainerStarted","Data":"354ab3b77ccf84bc1a242480fadf5e5921a757ea9c92f0528ca11e6cd360e44e"} Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.905959 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-skjmc" event={"ID":"7251f858-d281-4566-a08d-e181cccf0542","Type":"ContainerStarted","Data":"8fc76ee5bc22d1b8afbbaa11e92937b9abf1e2b370829113b1e2a06dd3e75aac"} Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.917093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" event={"ID":"9e02d43f-9df8-4896-b407-23fc963d956b","Type":"ContainerStarted","Data":"3be138fab03a1caa592ca48a63fdf06c2fe926f4d34cabe0e3b4f68251375840"} Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.919532 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:41 crc kubenswrapper[4990]: E1003 
09:45:41.920029 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.420013276 +0000 UTC m=+124.216645133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.938931 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" event={"ID":"a518ef04-5379-442a-89d6-f0ab745e35c6","Type":"ContainerStarted","Data":"dac297826c6f80f6659cf0baef9789078a4e3034b7e8c9d65743f0801e72c045"} Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.974125 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" event={"ID":"130e5354-c1dd-4314-a44a-21e77603424e","Type":"ContainerStarted","Data":"32d8e64d57653feaa5bf5d8b6a1abfd0a3eda0b4e9ad4a5bc0ff181cb5f52412"} Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.976126 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.977314 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sgmrx" podStartSLOduration=7.977283208 podStartE2EDuration="7.977283208s" podCreationTimestamp="2025-10-03 09:45:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:41.929988266 +0000 UTC m=+123.726620143" watchObservedRunningTime="2025-10-03 09:45:41.977283208 +0000 UTC m=+123.773915065" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.978341 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-l96ff" podStartSLOduration=102.978331835 podStartE2EDuration="1m42.978331835s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:41.97583969 +0000 UTC m=+123.772471547" watchObservedRunningTime="2025-10-03 09:45:41.978331835 +0000 UTC m=+123.774963692" Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.991254 4990 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h2n67 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Oct 03 09:45:41 crc kubenswrapper[4990]: I1003 09:45:41.991303 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" podUID="130e5354-c1dd-4314-a44a-21e77603424e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.000888 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" event={"ID":"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6","Type":"ContainerStarted","Data":"fa89a14e69c1d2b2592b71e5f4c7739f153ddf5c612661ea62205c3950eb35ab"} Oct 03 09:45:42 crc 
kubenswrapper[4990]: I1003 09:45:42.000947 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" event={"ID":"dc59e860-cd3f-40b8-bc9a-a26ae53e89d6","Type":"ContainerStarted","Data":"e62d004b04897b7a0a12ec6f3b5f3e30dc7a96cd2535ca2dbf4c6119ee871759"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.001808 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.011328 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" event={"ID":"1469c562-1020-4d06-8018-6fd61392d855","Type":"ContainerStarted","Data":"1c51a3f12d4cd8d05f26031e126f694a3ad4445e0f691cdbe8d0c915d9aebe22"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.018749 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" podStartSLOduration=103.018731777 podStartE2EDuration="1m43.018731777s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.018353727 +0000 UTC m=+123.814985584" watchObservedRunningTime="2025-10-03 09:45:42.018731777 +0000 UTC m=+123.815363634" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.020799 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.022393 4990 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.522370962 +0000 UTC m=+124.319002829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.027332 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" event={"ID":"5bd5d267-6907-4d32-9620-dd6270e911f7","Type":"ContainerStarted","Data":"2c7968d26ad6d4e4669638e81bfd2a046923998428660605182d899d83fe292d"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.040917 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4dklv" podStartSLOduration=103.040895494 podStartE2EDuration="1m43.040895494s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.040049352 +0000 UTC m=+123.836681209" watchObservedRunningTime="2025-10-03 09:45:42.040895494 +0000 UTC m=+123.837527351" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.045013 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69dts" 
event={"ID":"92e83dbe-2e24-432d-9a97-1a15a829b009","Type":"ContainerStarted","Data":"03a1581000c419550f3d2721916f178e40194668cef5786cfe546d9dde4308b3"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.060696 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" event={"ID":"e53bb259-c96c-4369-bc00-afbbf9a13eef","Type":"ContainerStarted","Data":"f99ab6e01f82c61cac115b2fff10bc16b00d2c16d417ecdb34759b4e77aea36b"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.062029 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.062456 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" podStartSLOduration=103.062440295 podStartE2EDuration="1m43.062440295s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.061190533 +0000 UTC m=+123.857822400" watchObservedRunningTime="2025-10-03 09:45:42.062440295 +0000 UTC m=+123.859072152" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.065932 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" event={"ID":"a42cdb61-90ff-40c9-8b18-95b86f1dfc3a","Type":"ContainerStarted","Data":"a2a6099e7ff8fdacfc3630c410663e835ec6d20b9120aca229200ecba344586f"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.073354 4990 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6mnsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: 
connection refused" start-of-body= Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.073420 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" podUID="e53bb259-c96c-4369-bc00-afbbf9a13eef" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.089541 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" event={"ID":"554eeba0-898e-4fd2-9308-2e3028b4e8ae","Type":"ContainerStarted","Data":"4c94d0df5d240ba8b434301d0b924b88cab58e89f55bfeb883f78287f6e5e606"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.106321 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7qwq7" podStartSLOduration=103.106295047 podStartE2EDuration="1m43.106295047s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.098212197 +0000 UTC m=+123.894844054" watchObservedRunningTime="2025-10-03 09:45:42.106295047 +0000 UTC m=+123.902926914" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.111082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" event={"ID":"0c77dc95-2e7f-47c7-821d-c94a13f18f32","Type":"ContainerStarted","Data":"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.112417 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.114101 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsvpm" event={"ID":"3f47486b-dcdc-457c-b6d1-2956bc3155f0","Type":"ContainerStarted","Data":"5ce858cc25f76b99c03b44b75a36ec1ed71eba9d0bcb2777f255a6a315fadbb8"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.114150 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gsvpm" event={"ID":"3f47486b-dcdc-457c-b6d1-2956bc3155f0","Type":"ContainerStarted","Data":"cda9b32f15d1ccefbd1a8c48d7d67b7da25bffab05d523939908ca1fb12ab616"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.114799 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.116173 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" event={"ID":"509b877d-1d79-4882-96f7-725d81a000cd","Type":"ContainerStarted","Data":"1b11e4141d60ff1f899e21a3b7224b3110e8363fa15da8a21d82aca01c1dcd76"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.123278 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.123586 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.623573257 +0000 UTC m=+124.420205114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.126849 4990 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-prdsm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.126894 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.138858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" event={"ID":"4a6095bd-ea6a-4e7b-8504-89aa4704a720","Type":"ContainerStarted","Data":"8f24706db8d328f56ac961addc5195aa8bee5d01a3bc2bae07d0a04519a70109"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.155390 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rwrlt" event={"ID":"093acc63-fa82-4f09-a668-95aea75f0352","Type":"ContainerStarted","Data":"af90fee743caa08650814617d70f4460673364cd17dcbcca73f6e7d1148f526f"} Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.158637 4990 patch_prober.go:28] interesting pod/console-operator-58897d9998-zr9m5 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.158682 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" podUID="3a6e6b23-3c66-42a4-8f01-c8605efe1412" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.177954 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" podStartSLOduration=42.177931533 podStartE2EDuration="42.177931533s" podCreationTimestamp="2025-10-03 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.137565232 +0000 UTC m=+123.934197089" watchObservedRunningTime="2025-10-03 09:45:42.177931533 +0000 UTC m=+123.974563400" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.180180 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gsvpm" podStartSLOduration=8.180164961 podStartE2EDuration="8.180164961s" podCreationTimestamp="2025-10-03 09:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.176530846 +0000 UTC m=+123.973162703" watchObservedRunningTime="2025-10-03 09:45:42.180164961 +0000 UTC m=+123.976796818" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.185270 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6vmzk" Oct 03 09:45:42 crc 
kubenswrapper[4990]: I1003 09:45:42.226139 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.228614 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.728592132 +0000 UTC m=+124.525223999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.260916 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:42 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:42 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:42 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.260975 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.298921 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r8p2v" podStartSLOduration=103.298902023 podStartE2EDuration="1m43.298902023s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.209103165 +0000 UTC m=+124.005735032" watchObservedRunningTime="2025-10-03 09:45:42.298902023 +0000 UTC m=+124.095533880" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.330133 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.330496 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.830480055 +0000 UTC m=+124.627111912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.352307 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" podStartSLOduration=103.352287913 podStartE2EDuration="1m43.352287913s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.351803931 +0000 UTC m=+124.148435798" watchObservedRunningTime="2025-10-03 09:45:42.352287913 +0000 UTC m=+124.148919770" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.352829 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" podStartSLOduration=103.352823287 podStartE2EDuration="1m43.352823287s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.30108549 +0000 UTC m=+124.097717347" watchObservedRunningTime="2025-10-03 09:45:42.352823287 +0000 UTC m=+124.149455144" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.384166 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-s22jw" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.399203 4990 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-29mz5" podStartSLOduration=103.399181574 podStartE2EDuration="1m43.399181574s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.398361533 +0000 UTC m=+124.194993390" watchObservedRunningTime="2025-10-03 09:45:42.399181574 +0000 UTC m=+124.195813441" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.431812 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.432199 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:42.932184084 +0000 UTC m=+124.728815941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.437685 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-llj5c" podStartSLOduration=103.437665857 podStartE2EDuration="1m43.437665857s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.434804332 +0000 UTC m=+124.231436189" watchObservedRunningTime="2025-10-03 09:45:42.437665857 +0000 UTC m=+124.234297704" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.533873 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.534291 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.034278563 +0000 UTC m=+124.830910410 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.634934 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.635189 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.135149949 +0000 UTC m=+124.931781806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.635278 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.635759 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.135744325 +0000 UTC m=+124.932376182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.711070 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" podStartSLOduration=103.711050686 podStartE2EDuration="1m43.711050686s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:42.648785174 +0000 UTC m=+124.445417031" watchObservedRunningTime="2025-10-03 09:45:42.711050686 +0000 UTC m=+124.507682543" Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.736438 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.736680 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.236644732 +0000 UTC m=+125.033276589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.736919 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.737270 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.237255088 +0000 UTC m=+125.033886945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.838716 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.838875 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.338846004 +0000 UTC m=+125.135477861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.838926 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.839365 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.339350157 +0000 UTC m=+125.135982014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.940733 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.940937 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.440906502 +0000 UTC m=+125.237538359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:42 crc kubenswrapper[4990]: I1003 09:45:42.941013 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:42 crc kubenswrapper[4990]: E1003 09:45:42.941579 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.441572299 +0000 UTC m=+125.238204156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.042812 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.043013 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.54298314 +0000 UTC m=+125.339614997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.043258 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.043692 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.543675898 +0000 UTC m=+125.340307835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.144147 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.144379 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.644346149 +0000 UTC m=+125.440978006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.144433 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.144752 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.64474049 +0000 UTC m=+125.441372347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.156836 4990 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wnxdn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": context deadline exceeded" start-of-body= Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.156931 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": context deadline exceeded" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.164773 4990 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-prdsm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.164823 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.165262 4990 patch_prober.go:28] interesting 
pod/packageserver-d55dfcdfc-h2n67 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.165290 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" podUID="130e5354-c1dd-4314-a44a-21e77603424e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.183099 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6mnsf" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.246933 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.248832 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.74881206 +0000 UTC m=+125.545443927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.278001 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:43 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:43 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:43 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.278073 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.350273 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.350690 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 09:45:43.850661792 +0000 UTC m=+125.647293649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.451010 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.451466 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:43.951450027 +0000 UTC m=+125.748081884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.503626 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zr9m5" Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.557642 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.558064 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.058044323 +0000 UTC m=+125.854676180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.658702 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.658838 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.158813047 +0000 UTC m=+125.955444904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.659222 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.659816 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.159807063 +0000 UTC m=+125.956438920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.760580 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.760971 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.260955207 +0000 UTC m=+126.057587064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.862624 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.863271 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.363257511 +0000 UTC m=+126.159889368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:43 crc kubenswrapper[4990]: I1003 09:45:43.964272 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:43 crc kubenswrapper[4990]: E1003 09:45:43.964678 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.464662341 +0000 UTC m=+126.261294198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.066291 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.066733 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.566714359 +0000 UTC m=+126.363346216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.101252 4990 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.167343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.167930 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.667909244 +0000 UTC m=+126.464541101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.171005 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69dts" event={"ID":"92e83dbe-2e24-432d-9a97-1a15a829b009","Type":"ContainerStarted","Data":"cb8284bc973a2087e090672f3a5160145fa539fc2c44c6fbfe7cf53cc93bd51e"} Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.171062 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69dts" event={"ID":"92e83dbe-2e24-432d-9a97-1a15a829b009","Type":"ContainerStarted","Data":"98ae0cc208255cf946de82b9706b27493ed1404e999ec5aeea273ea08f114603"} Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.171075 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-69dts" event={"ID":"92e83dbe-2e24-432d-9a97-1a15a829b009","Type":"ContainerStarted","Data":"43c93efb97647415d6f725a3e05da89ba25674f5b8663b682660d1af453e21cf"} Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.193191 4990 generic.go:334] "Generic (PLEG): container finished" podID="5bd5d267-6907-4d32-9620-dd6270e911f7" containerID="2c7968d26ad6d4e4669638e81bfd2a046923998428660605182d899d83fe292d" exitCode=0 Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.194155 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" 
event={"ID":"5bd5d267-6907-4d32-9620-dd6270e911f7","Type":"ContainerDied","Data":"2c7968d26ad6d4e4669638e81bfd2a046923998428660605182d899d83fe292d"} Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.226184 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-69dts" podStartSLOduration=10.226157421 podStartE2EDuration="10.226157421s" podCreationTimestamp="2025-10-03 09:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:44.209092136 +0000 UTC m=+126.005724013" watchObservedRunningTime="2025-10-03 09:45:44.226157421 +0000 UTC m=+126.022789278" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.239397 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.258174 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:44 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:44 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:44 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.258255 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.269291 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.270127 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.770103375 +0000 UTC m=+126.566735232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.372869 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.373107 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.873070067 +0000 UTC m=+126.669701934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.373195 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfgbn"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.373251 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.373836 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.873816396 +0000 UTC m=+126.670448253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.374564 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.378031 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.396441 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfgbn"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.474490 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.474844 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.474879 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-852qj\" (UniqueName: \"kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.474941 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.475208 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:44.975187065 +0000 UTC m=+126.771818922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.497738 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h2n67" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.573238 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.574492 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.575991 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.576037 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-852qj\" (UniqueName: \"kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.576088 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.576148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.576547 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.076533064 +0000 UTC m=+126.873164921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.576816 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.577204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.577473 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.596378 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.610600 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-852qj\" (UniqueName: \"kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj\") pod \"community-operators-lfgbn\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " pod="openshift-marketplace/community-operators-lfgbn" Oct 03 
09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.677286 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.677561 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.177475103 +0000 UTC m=+126.974106950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.677691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.677862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities\") pod \"certified-operators-4rgvf\" (UID: 
\"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.677892 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns7mh\" (UniqueName: \"kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.677913 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.678119 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.178102929 +0000 UTC m=+126.974734856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.691208 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.763905 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.765016 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.779138 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.779392 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.779418 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.779434 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns7mh\" (UniqueName: \"kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " 
pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.779853 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.279837538 +0000 UTC m=+127.076469395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.780219 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.780476 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.783661 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.807475 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ns7mh\" (UniqueName: \"kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh\") pod \"certified-operators-4rgvf\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.880546 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.880847 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.880865 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fns\" (UniqueName: \"kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.880922 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc 
kubenswrapper[4990]: E1003 09:45:44.881274 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.381259819 +0000 UTC m=+127.177891676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hmdg" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.888541 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.971159 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.972256 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.979134 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.988309 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.988609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.988734 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.988767 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fns\" (UniqueName: \"kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: E1003 09:45:44.989327 4990 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 09:45:45.489306463 +0000 UTC m=+127.285938330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.989804 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.990031 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:44 crc kubenswrapper[4990]: I1003 09:45:44.992631 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfgbn"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.013677 4990 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T09:45:44.101292919Z","Handler":null,"Name":""} Oct 03 09:45:45 crc 
kubenswrapper[4990]: I1003 09:45:45.027248 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fns\" (UniqueName: \"kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns\") pod \"community-operators-7297n\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.047769 4990 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.047824 4990 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.090597 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.091187 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxrx\" (UniqueName: \"kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.091303 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 
09:45:45.091331 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.091366 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.094804 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.094858 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.098183 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.106146 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.112795 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.118293 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.120015 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.159183 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hmdg\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.194989 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.195335 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.195411 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.195461 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxrx\" (UniqueName: \"kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.195533 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.195562 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.196709 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.197660 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.208992 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.236720 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxrx\" (UniqueName: \"kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx\") pod \"certified-operators-rkwwz\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.265462 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerStarted","Data":"06e210acbf4e8997bfd470bd98064dec0906f35790680683c578200d6fbe6c0a"} Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.267359 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:45 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:45 crc kubenswrapper[4990]: [+]process-running ok Oct 03 
09:45:45 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.267416 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.296704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.297121 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.297241 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.327068 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 
09:45:45.327776 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.329869 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.332737 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.335211 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.448809 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.591521 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.773192 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.830106 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.906189 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume\") pod \"5bd5d267-6907-4d32-9620-dd6270e911f7\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.906268 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccn72\" (UniqueName: \"kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72\") pod \"5bd5d267-6907-4d32-9620-dd6270e911f7\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.906347 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume\") pod \"5bd5d267-6907-4d32-9620-dd6270e911f7\" (UID: \"5bd5d267-6907-4d32-9620-dd6270e911f7\") " Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.907085 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "5bd5d267-6907-4d32-9620-dd6270e911f7" (UID: "5bd5d267-6907-4d32-9620-dd6270e911f7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.924666 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72" (OuterVolumeSpecName: "kube-api-access-ccn72") pod "5bd5d267-6907-4d32-9620-dd6270e911f7" (UID: "5bd5d267-6907-4d32-9620-dd6270e911f7"). InnerVolumeSpecName "kube-api-access-ccn72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.925339 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5bd5d267-6907-4d32-9620-dd6270e911f7" (UID: "5bd5d267-6907-4d32-9620-dd6270e911f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.962214 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.967568 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.975548 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.975590 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.977320 4990 patch_prober.go:28] interesting pod/console-f9d7485db-5jgtm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= 
Oct 03 09:45:45 crc kubenswrapper[4990]: I1003 09:45:45.977360 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5jgtm" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.007737 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bd5d267-6907-4d32-9620-dd6270e911f7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.007771 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bd5d267-6907-4d32-9620-dd6270e911f7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.007788 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccn72\" (UniqueName: \"kubernetes.io/projected/5bd5d267-6907-4d32-9620-dd6270e911f7-kube-api-access-ccn72\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.114553 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.114619 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.122062 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.130115 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.136976 
4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tjqxd" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.268402 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:46 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:46 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:46 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.268526 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.285415 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f863597-24b7-4013-b78f-daee3c406a29" containerID="90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009" exitCode=0 Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.285673 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerDied","Data":"90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.285756 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerStarted","Data":"bb3d97cfed9c46567a89ce3e00d271c2a909c2c3968b1b64127cf86af8a2454c"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.287399 4990 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.291581 4990 generic.go:334] "Generic (PLEG): container finished" podID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerID="ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe" exitCode=0 Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.291677 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerDied","Data":"ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.293534 4990 patch_prober.go:28] interesting pod/downloads-7954f5f757-f5nfx container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.293586 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f5nfx" podUID="60f1ddd8-ebdb-4575-b06e-619cbe196937" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.294022 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" event={"ID":"5bd5d267-6907-4d32-9620-dd6270e911f7","Type":"ContainerDied","Data":"aa72d1f3a494eea8f66a21ee3f8a34a7d069cb4fe649c96606ac56dac4b84cb4"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.294073 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa72d1f3a494eea8f66a21ee3f8a34a7d069cb4fe649c96606ac56dac4b84cb4" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.294147 4990 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-f5nfx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.294264 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f5nfx" podUID="60f1ddd8-ebdb-4575-b06e-619cbe196937" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.294213 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.296059 4990 generic.go:334] "Generic (PLEG): container finished" podID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerID="394dffff0a3e8b18af0c41817fa2c5a5ad728562fece6facda0b17558a1c9e02" exitCode=0 Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.296790 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerDied","Data":"394dffff0a3e8b18af0c41817fa2c5a5ad728562fece6facda0b17558a1c9e02"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.296837 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerStarted","Data":"7a613309d95e2daf57e8845de737161bfe24776f025ccd0e310e9b6a15d6d810"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.305052 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"0a0c75a4-e37d-43b0-8386-c771aa5044ed","Type":"ContainerStarted","Data":"c90c28947104ba972cc4f76cbae565ee42a9fac65dcb447752d69cc7edef74cf"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.305109 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0a0c75a4-e37d-43b0-8386-c771aa5044ed","Type":"ContainerStarted","Data":"edbd38a86ac7aa38de9e61f8a23a7d80d20ad680075d95cc76cc04f883ddc14d"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.307116 4990 generic.go:334] "Generic (PLEG): container finished" podID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerID="0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942" exitCode=0 Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.307258 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerDied","Data":"0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.307296 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerStarted","Data":"c9f095e25895460af208577df0aa7d1fa68724bec0ae652e05bac276677fae48"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.312243 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" event={"ID":"becdef88-76d3-402a-b26f-23a4cbdf1644","Type":"ContainerStarted","Data":"4847044c81b2e5d05d9f6c1f20532a81232b5a99b0ab18b730f8309def921ef0"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.312287 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" 
event={"ID":"becdef88-76d3-402a-b26f-23a4cbdf1644","Type":"ContainerStarted","Data":"6dfdaab6f1e077bd805538d3cc8e1e57e78fdf9f0006e00636b7b799d99cdc97"} Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.312303 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.316662 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zjl7g" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.402304 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.402279598 podStartE2EDuration="1.402279598s" podCreationTimestamp="2025-10-03 09:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:46.380818949 +0000 UTC m=+128.177450816" watchObservedRunningTime="2025-10-03 09:45:46.402279598 +0000 UTC m=+128.198911455" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.419968 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" podStartSLOduration=107.419940668 podStartE2EDuration="1m47.419940668s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:46.416809887 +0000 UTC m=+128.213441754" watchObservedRunningTime="2025-10-03 09:45:46.419940668 +0000 UTC m=+128.216572525" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.564174 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"] Oct 03 09:45:46 crc kubenswrapper[4990]: E1003 09:45:46.564715 4990 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5bd5d267-6907-4d32-9620-dd6270e911f7" containerName="collect-profiles" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.564729 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd5d267-6907-4d32-9620-dd6270e911f7" containerName="collect-profiles" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.564829 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd5d267-6907-4d32-9620-dd6270e911f7" containerName="collect-profiles" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.565582 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.568395 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.619357 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.619435 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9kz\" (UniqueName: \"kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.619501 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content\") pod 
\"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.633440 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"] Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.720796 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.720885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9kz\" (UniqueName: \"kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.720928 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.721682 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.721886 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.757101 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9kz\" (UniqueName: \"kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz\") pod \"redhat-marketplace-99zt5\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.882486 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.883937 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.966673 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.968383 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:46 crc kubenswrapper[4990]: I1003 09:45:46.999432 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.025973 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6m2n\" (UniqueName: \"kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.026094 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.026199 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.128792 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6m2n\" (UniqueName: \"kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.128871 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.128967 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.130176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.130543 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.184426 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6m2n\" (UniqueName: \"kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n\") pod \"redhat-marketplace-qsxcm\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.214266 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.254381 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.264749 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:47 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:47 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:47 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.264815 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.275713 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.307797 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.336037 4990 generic.go:334] "Generic (PLEG): container finished" podID="0a0c75a4-e37d-43b0-8386-c771aa5044ed" containerID="c90c28947104ba972cc4f76cbae565ee42a9fac65dcb447752d69cc7edef74cf" exitCode=0 Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.336729 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0a0c75a4-e37d-43b0-8386-c771aa5044ed","Type":"ContainerDied","Data":"c90c28947104ba972cc4f76cbae565ee42a9fac65dcb447752d69cc7edef74cf"} Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.566055 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.567928 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.581175 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.593601 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.655605 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxsl\" (UniqueName: \"kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.655699 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.655727 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.714642 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.757840 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxsl\" (UniqueName: \"kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.757949 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.757971 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: 
I1003 09:45:47.760238 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.764455 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: W1003 09:45:47.766610 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3df226_ebca_4e5a_b166_778b6c69b247.slice/crio-74616c7116f104fe4b9af1e027745f459d35fc9a85a33b314f68314f56de4034 WatchSource:0}: Error finding container 74616c7116f104fe4b9af1e027745f459d35fc9a85a33b314f68314f56de4034: Status 404 returned error can't find the container with id 74616c7116f104fe4b9af1e027745f459d35fc9a85a33b314f68314f56de4034 Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.811989 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxsl\" (UniqueName: \"kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl\") pod \"redhat-operators-jcp8t\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.959314 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.981651 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:45:47 crc kubenswrapper[4990]: I1003 09:45:47.982765 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.005886 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.062581 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.062677 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzc2\" (UniqueName: \"kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.063023 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.164193 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.164644 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzc2\" (UniqueName: \"kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.164665 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.165119 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.165405 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.186590 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzc2\" (UniqueName: 
\"kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2\") pod \"redhat-operators-pfgr8\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.261188 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:48 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:48 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:48 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.261252 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.316256 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"] Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.345282 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerStarted","Data":"74616c7116f104fe4b9af1e027745f459d35fc9a85a33b314f68314f56de4034"} Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.347410 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerStarted","Data":"9ac21dda45428f62266c3560601a550e21a68f6b1fa560192dfc6d228da242bb"} Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.352739 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerID="196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854" exitCode=0 Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.352801 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerDied","Data":"196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854"} Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.352839 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerStarted","Data":"72c0d1f53540f638fdef58401ddc8b3e8b9bac6fdcddf09006780a4ef819b552"} Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.360141 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.648249 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.758971 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.887698 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir\") pod \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.887961 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access\") pod \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\" (UID: \"0a0c75a4-e37d-43b0-8386-c771aa5044ed\") " Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.889874 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0a0c75a4-e37d-43b0-8386-c771aa5044ed" (UID: "0a0c75a4-e37d-43b0-8386-c771aa5044ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:45:48 crc kubenswrapper[4990]: I1003 09:45:48.895479 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0a0c75a4-e37d-43b0-8386-c771aa5044ed" (UID: "0a0c75a4-e37d-43b0-8386-c771aa5044ed"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.024537 4990 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.024590 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a0c75a4-e37d-43b0-8386-c771aa5044ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.261794 4990 patch_prober.go:28] interesting pod/router-default-5444994796-hdl6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 09:45:49 crc kubenswrapper[4990]: [-]has-synced failed: reason withheld Oct 03 09:45:49 crc kubenswrapper[4990]: [+]process-running ok Oct 03 09:45:49 crc kubenswrapper[4990]: healthz check failed Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.262190 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hdl6b" podUID="b93f3c26-69a5-4220-956a-2f6bbf884c9a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.383918 4990 generic.go:334] "Generic (PLEG): container finished" podID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerID="c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620" exitCode=0 Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.384029 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerDied","Data":"c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620"} Oct 
03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.402896 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerStarted","Data":"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642"} Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.402968 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 09:45:49 crc kubenswrapper[4990]: E1003 09:45:49.403262 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0c75a4-e37d-43b0-8386-c771aa5044ed" containerName="pruner" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.403278 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0c75a4-e37d-43b0-8386-c771aa5044ed" containerName="pruner" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.403607 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0c75a4-e37d-43b0-8386-c771aa5044ed" containerName="pruner" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.407040 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerStarted","Data":"eb504bdf99d2963c37c271b07bc65f4ee9dea009e60471fde2e750b1a162f25b"} Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.407176 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.410698 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.410926 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.417862 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerID="965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200" exitCode=0 Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.418053 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerDied","Data":"965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200"} Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.425559 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.431270 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0a0c75a4-e37d-43b0-8386-c771aa5044ed","Type":"ContainerDied","Data":"edbd38a86ac7aa38de9e61f8a23a7d80d20ad680075d95cc76cc04f883ddc14d"} Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.431314 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edbd38a86ac7aa38de9e61f8a23a7d80d20ad680075d95cc76cc04f883ddc14d" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.431410 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.539837 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.539960 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.640861 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.640970 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.641051 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.691007 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:49 crc kubenswrapper[4990]: I1003 09:45:49.738397 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.081547 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.259329 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.263134 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hdl6b" Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.475212 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edd039c2-d54b-4009-b0af-833dd18733c0","Type":"ContainerStarted","Data":"c1331faa1040b4cb54bc67eaf354a1008ca46870fbea401cf43ede946e10a168"} Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.480830 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerDied","Data":"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642"} Oct 03 09:45:50 crc kubenswrapper[4990]: I1003 09:45:50.481465 4990 generic.go:334] "Generic (PLEG): container finished" podID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" 
containerID="aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642" exitCode=0 Oct 03 09:45:51 crc kubenswrapper[4990]: I1003 09:45:51.513052 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edd039c2-d54b-4009-b0af-833dd18733c0","Type":"ContainerStarted","Data":"6d9f609965e9be481ac6f23a635fff643b8796b264ca49e25cf6e5e85820b9a2"} Oct 03 09:45:52 crc kubenswrapper[4990]: I1003 09:45:52.438995 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gsvpm" Oct 03 09:45:52 crc kubenswrapper[4990]: I1003 09:45:52.544305 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.54426912 podStartE2EDuration="3.54426912s" podCreationTimestamp="2025-10-03 09:45:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:45:52.543919131 +0000 UTC m=+134.340550998" watchObservedRunningTime="2025-10-03 09:45:52.54426912 +0000 UTC m=+134.340900977" Oct 03 09:45:53 crc kubenswrapper[4990]: I1003 09:45:53.537603 4990 generic.go:334] "Generic (PLEG): container finished" podID="edd039c2-d54b-4009-b0af-833dd18733c0" containerID="6d9f609965e9be481ac6f23a635fff643b8796b264ca49e25cf6e5e85820b9a2" exitCode=0 Oct 03 09:45:53 crc kubenswrapper[4990]: I1003 09:45:53.537659 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edd039c2-d54b-4009-b0af-833dd18733c0","Type":"ContainerDied","Data":"6d9f609965e9be481ac6f23a635fff643b8796b264ca49e25cf6e5e85820b9a2"} Oct 03 09:45:55 crc kubenswrapper[4990]: I1003 09:45:55.981024 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:55 crc kubenswrapper[4990]: I1003 09:45:55.988715 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:45:56 crc kubenswrapper[4990]: I1003 09:45:56.300359 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f5nfx" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.545666 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.592384 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"edd039c2-d54b-4009-b0af-833dd18733c0","Type":"ContainerDied","Data":"c1331faa1040b4cb54bc67eaf354a1008ca46870fbea401cf43ede946e10a168"} Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.592432 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1331faa1040b4cb54bc67eaf354a1008ca46870fbea401cf43ede946e10a168" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.592475 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.672033 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir\") pod \"edd039c2-d54b-4009-b0af-833dd18733c0\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.672151 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access\") pod \"edd039c2-d54b-4009-b0af-833dd18733c0\" (UID: \"edd039c2-d54b-4009-b0af-833dd18733c0\") " Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.672158 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "edd039c2-d54b-4009-b0af-833dd18733c0" (UID: "edd039c2-d54b-4009-b0af-833dd18733c0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.672533 4990 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/edd039c2-d54b-4009-b0af-833dd18733c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.677601 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "edd039c2-d54b-4009-b0af-833dd18733c0" (UID: "edd039c2-d54b-4009-b0af-833dd18733c0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:46:01 crc kubenswrapper[4990]: I1003 09:46:01.773545 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/edd039c2-d54b-4009-b0af-833dd18733c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:05 crc kubenswrapper[4990]: I1003 09:46:05.339260 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.862859 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.863007 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.863104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.863167 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.865125 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.866041 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.866251 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.875397 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.884876 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.889357 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.890040 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:46:07 crc kubenswrapper[4990]: I1003 09:46:07.989156 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 09:46:08 crc kubenswrapper[4990]: I1003 09:46:08.185876 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:46:08 crc kubenswrapper[4990]: I1003 09:46:08.190689 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:46:08 crc kubenswrapper[4990]: I1003 09:46:08.299658 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 09:46:17 crc kubenswrapper[4990]: I1003 09:46:17.349906 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x692t" Oct 03 09:46:21 crc kubenswrapper[4990]: I1003 09:46:21.283772 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:46:21 crc kubenswrapper[4990]: I1003 09:46:21.287623 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 09:46:21 crc kubenswrapper[4990]: I1003 09:46:21.303436 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2a21582-ac04-4caa-a823-7c30c7f788c9-metrics-certs\") pod \"network-metrics-daemon-gdrcw\" (UID: \"b2a21582-ac04-4caa-a823-7c30c7f788c9\") " pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:46:21 crc kubenswrapper[4990]: I1003 09:46:21.512168 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 09:46:21 crc kubenswrapper[4990]: I1003 09:46:21.520449 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gdrcw" Oct 03 09:46:23 crc kubenswrapper[4990]: E1003 09:46:23.427888 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 09:46:23 crc kubenswrapper[4990]: E1003 09:46:23.428359 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrzc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Container
ResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pfgr8_openshift-marketplace(7455e5b9-3e7a-4d4b-bf71-2b5767373598): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:23 crc kubenswrapper[4990]: E1003 09:46:23.429919 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pfgr8" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" Oct 03 09:46:24 crc kubenswrapper[4990]: E1003 09:46:24.632856 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pfgr8" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" Oct 03 09:46:24 crc kubenswrapper[4990]: E1003 09:46:24.702984 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 09:46:24 crc kubenswrapper[4990]: E1003 09:46:24.703364 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ns7mh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4rgvf_openshift-marketplace(0462ac94-e1a1-456b-89c7-a3a19eb8b80c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:24 crc kubenswrapper[4990]: E1003 09:46:24.704697 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4rgvf" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" Oct 03 09:46:25 crc 
kubenswrapper[4990]: I1003 09:46:25.303974 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:46:25 crc kubenswrapper[4990]: I1003 09:46:25.304484 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:46:25 crc kubenswrapper[4990]: E1003 09:46:25.937901 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4rgvf" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" Oct 03 09:46:26 crc kubenswrapper[4990]: E1003 09:46:26.011680 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 09:46:26 crc kubenswrapper[4990]: E1003 09:46:26.011918 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r9fns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7297n_openshift-marketplace(c25aa9b7-d693-4efb-80c0-28c7d44f4c59): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:26 crc kubenswrapper[4990]: E1003 09:46:26.013873 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7297n" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" Oct 03 09:46:26 crc 
kubenswrapper[4990]: E1003 09:46:26.036818 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 09:46:26 crc kubenswrapper[4990]: E1003 09:46:26.037045 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnxrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-rkwwz_openshift-marketplace(5f863597-24b7-4013-b78f-daee3c406a29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:26 crc kubenswrapper[4990]: E1003 09:46:26.038536 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rkwwz" podUID="5f863597-24b7-4013-b78f-daee3c406a29" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.595011 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7297n" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.595193 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rkwwz" podUID="5f863597-24b7-4013-b78f-daee3c406a29" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.680684 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.680903 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-852qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lfgbn_openshift-marketplace(7a3a86ea-aa6a-4dc4-8aed-794167e95b8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.682111 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lfgbn" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.723693 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.723945 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jdxsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jcp8t_openshift-marketplace(2fdc3ca0-0b83-4b64-bd5c-985658582ae3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:28 crc kubenswrapper[4990]: E1003 09:46:28.725376 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jcp8t" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.436228 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lfgbn" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.436536 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jcp8t" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.551007 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.551245 4990 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6m2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-qsxcm_openshift-marketplace(fa3df226-ebca-4e5a-b166-778b6c69b247): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.552474 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-qsxcm" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.581120 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.581328 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zl9kz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,
TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-99zt5_openshift-marketplace(75f571ee-03df-47ad-9c14-7b61ec0ed691): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.582819 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-99zt5" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.770628 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-99zt5" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" Oct 03 09:46:29 crc kubenswrapper[4990]: E1003 09:46:29.770845 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-qsxcm" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" Oct 03 09:46:29 crc kubenswrapper[4990]: I1003 09:46:29.886926 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gdrcw"] Oct 03 09:46:29 crc kubenswrapper[4990]: W1003 09:46:29.967659 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-fac0b4da0ff31a5e55569cd816c41ce97c3db6e3f54327a0ab2f8fccea2ccc3d WatchSource:0}: Error finding container fac0b4da0ff31a5e55569cd816c41ce97c3db6e3f54327a0ab2f8fccea2ccc3d: Status 404 returned error can't find the container with id fac0b4da0ff31a5e55569cd816c41ce97c3db6e3f54327a0ab2f8fccea2ccc3d Oct 03 09:46:29 crc kubenswrapper[4990]: W1003 09:46:29.983202 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-929468e0f98ffd2e3557952e6e1e7314ac456027e1b88338dcf405dfae6408ad WatchSource:0}: Error finding container 929468e0f98ffd2e3557952e6e1e7314ac456027e1b88338dcf405dfae6408ad: Status 404 returned error can't find the container with id 929468e0f98ffd2e3557952e6e1e7314ac456027e1b88338dcf405dfae6408ad Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.791711 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" event={"ID":"b2a21582-ac04-4caa-a823-7c30c7f788c9","Type":"ContainerStarted","Data":"939641927b812e9e7a5346e09ce90c8e81bb8d1d28959a36aa0baaaad0f80c05"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.792592 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" event={"ID":"b2a21582-ac04-4caa-a823-7c30c7f788c9","Type":"ContainerStarted","Data":"33730f3f65335ee3860361a9e6c034f7b25e92c5eace4e7e35f9f0fdb8a80b88"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.792622 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gdrcw" event={"ID":"b2a21582-ac04-4caa-a823-7c30c7f788c9","Type":"ContainerStarted","Data":"befcff0cd01d91784c4959e747b6eee65eb41df7a01601fc286a50cf8176e423"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.795135 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3b0a6833502525b137e0382c4f9b8948e8479d7b8aef0460c024a2096de0451f"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.795200 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"929468e0f98ffd2e3557952e6e1e7314ac456027e1b88338dcf405dfae6408ad"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.799134 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0dd7b8de7ef6f6f08a23606319349d4abec3ab5ccfc01f4c7064941a07a4ca8e"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.799198 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fac0b4da0ff31a5e55569cd816c41ce97c3db6e3f54327a0ab2f8fccea2ccc3d"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.799487 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.800998 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"93932e013815dd117c7f1ec9605517f3feae59c978334532a92d6a0153e7f4be"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.801030 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"518b9eb13640ec24e972171d8d1531cdb81ab1f4ea50871eb967d78e5c407432"} Oct 03 09:46:30 crc kubenswrapper[4990]: I1003 09:46:30.811573 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gdrcw" podStartSLOduration=151.811549569 podStartE2EDuration="2m31.811549569s" podCreationTimestamp="2025-10-03 09:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:46:30.807700567 +0000 UTC m=+172.604332444" watchObservedRunningTime="2025-10-03 09:46:30.811549569 +0000 UTC m=+172.608181426" Oct 03 09:46:36 crc kubenswrapper[4990]: I1003 09:46:36.841006 4990 generic.go:334] "Generic (PLEG): container finished" podID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerID="3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658" exitCode=0 Oct 03 09:46:36 crc kubenswrapper[4990]: I1003 09:46:36.841094 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerDied","Data":"3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658"} Oct 03 09:46:37 crc kubenswrapper[4990]: I1003 09:46:37.852153 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerStarted","Data":"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef"} Oct 03 09:46:37 crc kubenswrapper[4990]: I1003 09:46:37.877549 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pfgr8" podStartSLOduration=4.002638965 podStartE2EDuration="50.877500453s" podCreationTimestamp="2025-10-03 09:45:47 
+0000 UTC" firstStartedPulling="2025-10-03 09:45:50.483097351 +0000 UTC m=+132.279729208" lastFinishedPulling="2025-10-03 09:46:37.357958839 +0000 UTC m=+179.154590696" observedRunningTime="2025-10-03 09:46:37.875072879 +0000 UTC m=+179.671704736" watchObservedRunningTime="2025-10-03 09:46:37.877500453 +0000 UTC m=+179.674132330" Oct 03 09:46:38 crc kubenswrapper[4990]: I1003 09:46:38.360771 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:38 crc kubenswrapper[4990]: I1003 09:46:38.360869 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:38 crc kubenswrapper[4990]: I1003 09:46:38.885978 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerStarted","Data":"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"} Oct 03 09:46:39 crc kubenswrapper[4990]: I1003 09:46:39.493586 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pfgr8" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="registry-server" probeResult="failure" output=< Oct 03 09:46:39 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 09:46:39 crc kubenswrapper[4990]: > Oct 03 09:46:39 crc kubenswrapper[4990]: I1003 09:46:39.891446 4990 generic.go:334] "Generic (PLEG): container finished" podID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerID="c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6" exitCode=0 Oct 03 09:46:39 crc kubenswrapper[4990]: I1003 09:46:39.891535 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" 
event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerDied","Data":"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"} Oct 03 09:46:39 crc kubenswrapper[4990]: I1003 09:46:39.891588 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerStarted","Data":"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"} Oct 03 09:46:39 crc kubenswrapper[4990]: I1003 09:46:39.918067 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rgvf" podStartSLOduration=2.857139867 podStartE2EDuration="55.918039979s" podCreationTimestamp="2025-10-03 09:45:44 +0000 UTC" firstStartedPulling="2025-10-03 09:45:46.308589719 +0000 UTC m=+128.105221566" lastFinishedPulling="2025-10-03 09:46:39.369489821 +0000 UTC m=+181.166121678" observedRunningTime="2025-10-03 09:46:39.914073635 +0000 UTC m=+181.710705492" watchObservedRunningTime="2025-10-03 09:46:39.918039979 +0000 UTC m=+181.714671836" Oct 03 09:46:43 crc kubenswrapper[4990]: I1003 09:46:43.932103 4990 generic.go:334] "Generic (PLEG): container finished" podID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerID="1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0" exitCode=0 Oct 03 09:46:43 crc kubenswrapper[4990]: I1003 09:46:43.932234 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerDied","Data":"1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0"} Oct 03 09:46:43 crc kubenswrapper[4990]: I1003 09:46:43.938028 4990 generic.go:334] "Generic (PLEG): container finished" podID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerID="35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6" exitCode=0 Oct 03 09:46:43 crc kubenswrapper[4990]: I1003 09:46:43.938073 
4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerDied","Data":"35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6"} Oct 03 09:46:44 crc kubenswrapper[4990]: I1003 09:46:44.889228 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:46:44 crc kubenswrapper[4990]: I1003 09:46:44.889758 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:46:44 crc kubenswrapper[4990]: I1003 09:46:44.947818 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerStarted","Data":"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0"} Oct 03 09:46:44 crc kubenswrapper[4990]: I1003 09:46:44.949788 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:46:44 crc kubenswrapper[4990]: I1003 09:46:44.972366 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qsxcm" podStartSLOduration=3.929087975 podStartE2EDuration="58.972344198s" podCreationTimestamp="2025-10-03 09:45:46 +0000 UTC" firstStartedPulling="2025-10-03 09:45:49.40380985 +0000 UTC m=+131.200441707" lastFinishedPulling="2025-10-03 09:46:44.447066073 +0000 UTC m=+186.243697930" observedRunningTime="2025-10-03 09:46:44.967813439 +0000 UTC m=+186.764445296" watchObservedRunningTime="2025-10-03 09:46:44.972344198 +0000 UTC m=+186.768976055" Oct 03 09:46:45 crc kubenswrapper[4990]: I1003 09:46:45.006632 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:46:47 crc 
kubenswrapper[4990]: I1003 09:46:47.309242 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:46:47 crc kubenswrapper[4990]: I1003 09:46:47.309815 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:46:47 crc kubenswrapper[4990]: I1003 09:46:47.366223 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:46:48 crc kubenswrapper[4990]: I1003 09:46:48.401183 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:48 crc kubenswrapper[4990]: I1003 09:46:48.463189 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:50 crc kubenswrapper[4990]: I1003 09:46:50.116129 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:46:50 crc kubenswrapper[4990]: I1003 09:46:50.117031 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pfgr8" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="registry-server" containerID="cri-o://6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef" gracePeriod=2 Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.001147 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerStarted","Data":"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"} Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.913638 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.948854 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content\") pod \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.948925 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrzc2\" (UniqueName: \"kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2\") pod \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.949043 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities\") pod \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\" (UID: \"7455e5b9-3e7a-4d4b-bf71-2b5767373598\") " Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.950144 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities" (OuterVolumeSpecName: "utilities") pod "7455e5b9-3e7a-4d4b-bf71-2b5767373598" (UID: "7455e5b9-3e7a-4d4b-bf71-2b5767373598"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:46:51 crc kubenswrapper[4990]: I1003 09:46:51.961753 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2" (OuterVolumeSpecName: "kube-api-access-qrzc2") pod "7455e5b9-3e7a-4d4b-bf71-2b5767373598" (UID: "7455e5b9-3e7a-4d4b-bf71-2b5767373598"). InnerVolumeSpecName "kube-api-access-qrzc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.012129 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f863597-24b7-4013-b78f-daee3c406a29" containerID="9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b" exitCode=0 Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.012236 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerDied","Data":"9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.017939 4990 generic.go:334] "Generic (PLEG): container finished" podID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerID="6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef" exitCode=0 Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.018116 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerDied","Data":"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.018529 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pfgr8" event={"ID":"7455e5b9-3e7a-4d4b-bf71-2b5767373598","Type":"ContainerDied","Data":"eb504bdf99d2963c37c271b07bc65f4ee9dea009e60471fde2e750b1a162f25b"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.018557 4990 scope.go:117] "RemoveContainer" containerID="6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.018224 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pfgr8" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.021671 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerStarted","Data":"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.024447 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerID="2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932" exitCode=0 Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.024504 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerDied","Data":"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.031047 4990 generic.go:334] "Generic (PLEG): container finished" podID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerID="12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441" exitCode=0 Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.031155 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerDied","Data":"12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.037170 4990 generic.go:334] "Generic (PLEG): container finished" podID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerID="96556867a5cea0594310358c057bd26f867c2032539ece6dbe5db7b0abf1fb2f" exitCode=0 Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.037222 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" 
event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerDied","Data":"96556867a5cea0594310358c057bd26f867c2032539ece6dbe5db7b0abf1fb2f"} Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.042936 4990 scope.go:117] "RemoveContainer" containerID="3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.052844 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.052893 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrzc2\" (UniqueName: \"kubernetes.io/projected/7455e5b9-3e7a-4d4b-bf71-2b5767373598-kube-api-access-qrzc2\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.080860 4990 scope.go:117] "RemoveContainer" containerID="aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.084246 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfgbn" podStartSLOduration=8.013499416 podStartE2EDuration="1m8.084211752s" podCreationTimestamp="2025-10-03 09:45:44 +0000 UTC" firstStartedPulling="2025-10-03 09:45:46.292958981 +0000 UTC m=+128.089590838" lastFinishedPulling="2025-10-03 09:46:46.363671277 +0000 UTC m=+188.160303174" observedRunningTime="2025-10-03 09:46:52.052211889 +0000 UTC m=+193.848843736" watchObservedRunningTime="2025-10-03 09:46:52.084211752 +0000 UTC m=+193.880843609" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.114069 4990 scope.go:117] "RemoveContainer" containerID="6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef" Oct 03 09:46:52 crc kubenswrapper[4990]: E1003 09:46:52.114760 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef\": container with ID starting with 6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef not found: ID does not exist" containerID="6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.114836 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef"} err="failed to get container status \"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef\": rpc error: code = NotFound desc = could not find container \"6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef\": container with ID starting with 6464e812ef350b4329999e26aefda60dea12a4d73b076fae708da4d42ab4c3ef not found: ID does not exist" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.114989 4990 scope.go:117] "RemoveContainer" containerID="3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658" Oct 03 09:46:52 crc kubenswrapper[4990]: E1003 09:46:52.115654 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658\": container with ID starting with 3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658 not found: ID does not exist" containerID="3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.115696 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658"} err="failed to get container status \"3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658\": rpc error: code = NotFound desc = could not find container 
\"3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658\": container with ID starting with 3dc0fcfa688da6783f78d09d57b302327985db239f72172331cfc2f79f117658 not found: ID does not exist" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.115722 4990 scope.go:117] "RemoveContainer" containerID="aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642" Oct 03 09:46:52 crc kubenswrapper[4990]: E1003 09:46:52.116183 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642\": container with ID starting with aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642 not found: ID does not exist" containerID="aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642" Oct 03 09:46:52 crc kubenswrapper[4990]: I1003 09:46:52.116213 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642"} err="failed to get container status \"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642\": rpc error: code = NotFound desc = could not find container \"aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642\": container with ID starting with aa231391424ce04dcab76489748a7344e8065ed03999dd39824872259344a642 not found: ID does not exist" Oct 03 09:46:53 crc kubenswrapper[4990]: I1003 09:46:53.146081 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7455e5b9-3e7a-4d4b-bf71-2b5767373598" (UID: "7455e5b9-3e7a-4d4b-bf71-2b5767373598"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:46:53 crc kubenswrapper[4990]: I1003 09:46:53.166161 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7455e5b9-3e7a-4d4b-bf71-2b5767373598-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:53 crc kubenswrapper[4990]: I1003 09:46:53.253931 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:46:53 crc kubenswrapper[4990]: I1003 09:46:53.270148 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pfgr8"] Oct 03 09:46:54 crc kubenswrapper[4990]: I1003 09:46:54.692577 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:46:54 crc kubenswrapper[4990]: I1003 09:46:54.693066 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:46:54 crc kubenswrapper[4990]: I1003 09:46:54.742312 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:46:54 crc kubenswrapper[4990]: I1003 09:46:54.898424 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" path="/var/lib/kubelet/pods/7455e5b9-3e7a-4d4b-bf71-2b5767373598/volumes" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.055447 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerStarted","Data":"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"} Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.059149 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" 
event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerStarted","Data":"2fe0ae2a86589ea13c2092888b5afb5870ec708ae6ad4751775207af02033772"} Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.061689 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerStarted","Data":"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee"} Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.063866 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerStarted","Data":"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"} Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.091676 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.093336 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.106614 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-99zt5" podStartSLOduration=2.739921986 podStartE2EDuration="1m9.10659032s" podCreationTimestamp="2025-10-03 09:45:46 +0000 UTC" firstStartedPulling="2025-10-03 09:45:48.367772311 +0000 UTC m=+130.164404168" lastFinishedPulling="2025-10-03 09:46:54.734440645 +0000 UTC m=+196.531072502" observedRunningTime="2025-10-03 09:46:55.084142219 +0000 UTC m=+196.880774076" watchObservedRunningTime="2025-10-03 09:46:55.10659032 +0000 UTC m=+196.903222177" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.108257 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rkwwz" 
podStartSLOduration=2.638028446 podStartE2EDuration="1m11.108247544s" podCreationTimestamp="2025-10-03 09:45:44 +0000 UTC" firstStartedPulling="2025-10-03 09:45:46.287052968 +0000 UTC m=+128.083684825" lastFinishedPulling="2025-10-03 09:46:54.757272066 +0000 UTC m=+196.553903923" observedRunningTime="2025-10-03 09:46:55.105111721 +0000 UTC m=+196.901743588" watchObservedRunningTime="2025-10-03 09:46:55.108247544 +0000 UTC m=+196.904879401" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.125351 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7297n" podStartSLOduration=2.715947982 podStartE2EDuration="1m11.125325293s" podCreationTimestamp="2025-10-03 09:45:44 +0000 UTC" firstStartedPulling="2025-10-03 09:45:46.300863717 +0000 UTC m=+128.097495574" lastFinishedPulling="2025-10-03 09:46:54.710241038 +0000 UTC m=+196.506872885" observedRunningTime="2025-10-03 09:46:55.120738933 +0000 UTC m=+196.917370800" watchObservedRunningTime="2025-10-03 09:46:55.125325293 +0000 UTC m=+196.921957150" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.142595 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jcp8t" podStartSLOduration=2.883544572 podStartE2EDuration="1m8.142573197s" podCreationTimestamp="2025-10-03 09:45:47 +0000 UTC" firstStartedPulling="2025-10-03 09:45:49.421643725 +0000 UTC m=+131.218275582" lastFinishedPulling="2025-10-03 09:46:54.68067234 +0000 UTC m=+196.477304207" observedRunningTime="2025-10-03 09:46:55.139969919 +0000 UTC m=+196.936601776" watchObservedRunningTime="2025-10-03 09:46:55.142573197 +0000 UTC m=+196.939205054" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.304002 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.304085 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.332330 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:46:55 crc kubenswrapper[4990]: I1003 09:46:55.332410 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:46:56 crc kubenswrapper[4990]: I1003 09:46:56.143838 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7297n" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="registry-server" probeResult="failure" output=< Oct 03 09:46:56 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 09:46:56 crc kubenswrapper[4990]: > Oct 03 09:46:56 crc kubenswrapper[4990]: I1003 09:46:56.375045 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rkwwz" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="registry-server" probeResult="failure" output=< Oct 03 09:46:56 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 09:46:56 crc kubenswrapper[4990]: > Oct 03 09:46:56 crc kubenswrapper[4990]: I1003 09:46:56.892976 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:46:56 crc kubenswrapper[4990]: I1003 09:46:56.893487 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:46:56 crc kubenswrapper[4990]: I1003 09:46:56.945894 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:46:57 crc kubenswrapper[4990]: I1003 09:46:57.355268 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:46:57 crc kubenswrapper[4990]: I1003 09:46:57.960659 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:46:57 crc kubenswrapper[4990]: I1003 09:46:57.960788 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:46:59 crc kubenswrapper[4990]: I1003 09:46:59.001304 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jcp8t" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="registry-server" probeResult="failure" output=< Oct 03 09:46:59 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 09:46:59 crc kubenswrapper[4990]: > Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.312541 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.316742 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qsxcm" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="registry-server" containerID="cri-o://aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0" gracePeriod=2 Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.650872 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.778235 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content\") pod \"fa3df226-ebca-4e5a-b166-778b6c69b247\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.778312 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities\") pod \"fa3df226-ebca-4e5a-b166-778b6c69b247\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.778377 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6m2n\" (UniqueName: \"kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n\") pod \"fa3df226-ebca-4e5a-b166-778b6c69b247\" (UID: \"fa3df226-ebca-4e5a-b166-778b6c69b247\") " Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.779368 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities" (OuterVolumeSpecName: "utilities") pod "fa3df226-ebca-4e5a-b166-778b6c69b247" (UID: "fa3df226-ebca-4e5a-b166-778b6c69b247"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.787538 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n" (OuterVolumeSpecName: "kube-api-access-q6m2n") pod "fa3df226-ebca-4e5a-b166-778b6c69b247" (UID: "fa3df226-ebca-4e5a-b166-778b6c69b247"). InnerVolumeSpecName "kube-api-access-q6m2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.794808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa3df226-ebca-4e5a-b166-778b6c69b247" (UID: "fa3df226-ebca-4e5a-b166-778b6c69b247"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.879413 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.879454 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3df226-ebca-4e5a-b166-778b6c69b247-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:00 crc kubenswrapper[4990]: I1003 09:47:00.879466 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6m2n\" (UniqueName: \"kubernetes.io/projected/fa3df226-ebca-4e5a-b166-778b6c69b247-kube-api-access-q6m2n\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:00 crc kubenswrapper[4990]: E1003 09:47:00.925367 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3df226_ebca_4e5a_b166_778b6c69b247.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.099181 4990 generic.go:334] "Generic (PLEG): container finished" podID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerID="aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0" exitCode=0 Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.099247 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerDied","Data":"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0"} Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.099278 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsxcm" event={"ID":"fa3df226-ebca-4e5a-b166-778b6c69b247","Type":"ContainerDied","Data":"74616c7116f104fe4b9af1e027745f459d35fc9a85a33b314f68314f56de4034"} Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.099313 4990 scope.go:117] "RemoveContainer" containerID="aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.099473 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsxcm" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.120987 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.126750 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsxcm"] Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.127382 4990 scope.go:117] "RemoveContainer" containerID="1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.147844 4990 scope.go:117] "RemoveContainer" containerID="c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.169065 4990 scope.go:117] "RemoveContainer" containerID="aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0" Oct 03 09:47:01 crc kubenswrapper[4990]: E1003 09:47:01.169795 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0\": container with ID starting with aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0 not found: ID does not exist" containerID="aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.169861 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0"} err="failed to get container status \"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0\": rpc error: code = NotFound desc = could not find container \"aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0\": container with ID starting with aada2f1d22478194f74c76f77416fa8b646a65a308d4ac9a9b4592f46cf4f0b0 not found: ID does not exist" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.169908 4990 scope.go:117] "RemoveContainer" containerID="1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0" Oct 03 09:47:01 crc kubenswrapper[4990]: E1003 09:47:01.206197 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0\": container with ID starting with 1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0 not found: ID does not exist" containerID="1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.206712 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0"} err="failed to get container status \"1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0\": rpc error: code = NotFound desc = could not find container \"1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0\": container with ID 
starting with 1c224e40a238b8327717e23b3c16039951a78fa782ee6ada29d36b870285eae0 not found: ID does not exist" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.206760 4990 scope.go:117] "RemoveContainer" containerID="c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620" Oct 03 09:47:01 crc kubenswrapper[4990]: E1003 09:47:01.208764 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620\": container with ID starting with c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620 not found: ID does not exist" containerID="c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620" Oct 03 09:47:01 crc kubenswrapper[4990]: I1003 09:47:01.208828 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620"} err="failed to get container status \"c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620\": rpc error: code = NotFound desc = could not find container \"c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620\": container with ID starting with c3b51fd4fe30a2f9ba3e27a7f77aa8c521c08891020eba1c812541e35c599620 not found: ID does not exist" Oct 03 09:47:02 crc kubenswrapper[4990]: I1003 09:47:02.879262 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" path="/var/lib/kubelet/pods/fa3df226-ebca-4e5a-b166-778b6c69b247/volumes" Oct 03 09:47:04 crc kubenswrapper[4990]: I1003 09:47:04.752900 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:47:05 crc kubenswrapper[4990]: I1003 09:47:05.133574 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:47:05 crc 
kubenswrapper[4990]: I1003 09:47:05.184280 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:47:05 crc kubenswrapper[4990]: I1003 09:47:05.379138 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:47:05 crc kubenswrapper[4990]: I1003 09:47:05.420441 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:47:06 crc kubenswrapper[4990]: I1003 09:47:06.711476 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:47:06 crc kubenswrapper[4990]: I1003 09:47:06.945922 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.136129 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rkwwz" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="registry-server" containerID="cri-o://80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee" gracePeriod=2 Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.523080 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.616701 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnxrx\" (UniqueName: \"kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx\") pod \"5f863597-24b7-4013-b78f-daee3c406a29\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.616781 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities\") pod \"5f863597-24b7-4013-b78f-daee3c406a29\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.616820 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content\") pod \"5f863597-24b7-4013-b78f-daee3c406a29\" (UID: \"5f863597-24b7-4013-b78f-daee3c406a29\") " Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.618030 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities" (OuterVolumeSpecName: "utilities") pod "5f863597-24b7-4013-b78f-daee3c406a29" (UID: "5f863597-24b7-4013-b78f-daee3c406a29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.618298 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.628694 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx" (OuterVolumeSpecName: "kube-api-access-tnxrx") pod "5f863597-24b7-4013-b78f-daee3c406a29" (UID: "5f863597-24b7-4013-b78f-daee3c406a29"). InnerVolumeSpecName "kube-api-access-tnxrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.662935 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f863597-24b7-4013-b78f-daee3c406a29" (UID: "5f863597-24b7-4013-b78f-daee3c406a29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.719766 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnxrx\" (UniqueName: \"kubernetes.io/projected/5f863597-24b7-4013-b78f-daee3c406a29-kube-api-access-tnxrx\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:07 crc kubenswrapper[4990]: I1003 09:47:07.719807 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f863597-24b7-4013-b78f-daee3c406a29-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.019700 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.073574 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.143411 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f863597-24b7-4013-b78f-daee3c406a29" containerID="80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee" exitCode=0 Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.144070 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkwwz" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.147680 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerDied","Data":"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee"} Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.147728 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkwwz" event={"ID":"5f863597-24b7-4013-b78f-daee3c406a29","Type":"ContainerDied","Data":"bb3d97cfed9c46567a89ce3e00d271c2a909c2c3968b1b64127cf86af8a2454c"} Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.147748 4990 scope.go:117] "RemoveContainer" containerID="80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.171847 4990 scope.go:117] "RemoveContainer" containerID="9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.188690 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.193202 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rkwwz"] Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.194415 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.209023 4990 scope.go:117] "RemoveContainer" containerID="90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.236008 4990 scope.go:117] "RemoveContainer" containerID="80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee" Oct 03 09:47:08 
crc kubenswrapper[4990]: E1003 09:47:08.238731 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee\": container with ID starting with 80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee not found: ID does not exist" containerID="80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.238781 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee"} err="failed to get container status \"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee\": rpc error: code = NotFound desc = could not find container \"80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee\": container with ID starting with 80cc846e6bb620c2b8291c715eb9d46913720920c57bffb28a1faf2406d16dee not found: ID does not exist" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.238814 4990 scope.go:117] "RemoveContainer" containerID="9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b" Oct 03 09:47:08 crc kubenswrapper[4990]: E1003 09:47:08.239129 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b\": container with ID starting with 9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b not found: ID does not exist" containerID="9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.239162 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b"} err="failed to get container status 
\"9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b\": rpc error: code = NotFound desc = could not find container \"9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b\": container with ID starting with 9735883e181dc91b14231a72772efdc350b33895a01ad3ee2d63711ff1dbb91b not found: ID does not exist" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.239181 4990 scope.go:117] "RemoveContainer" containerID="90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009" Oct 03 09:47:08 crc kubenswrapper[4990]: E1003 09:47:08.239439 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009\": container with ID starting with 90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009 not found: ID does not exist" containerID="90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.239460 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009"} err="failed to get container status \"90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009\": rpc error: code = NotFound desc = could not find container \"90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009\": container with ID starting with 90cc63011c553c468d4ef1013ef319e529c4ed1f598e0967b29c288cfe1fa009 not found: ID does not exist" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.880119 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f863597-24b7-4013-b78f-daee3c406a29" path="/var/lib/kubelet/pods/5f863597-24b7-4013-b78f-daee3c406a29/volumes" Oct 03 09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.912372 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 
09:47:08 crc kubenswrapper[4990]: I1003 09:47:08.912745 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7297n" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="registry-server" containerID="cri-o://2fe0ae2a86589ea13c2092888b5afb5870ec708ae6ad4751775207af02033772" gracePeriod=2 Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.041664 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.154150 4990 generic.go:334] "Generic (PLEG): container finished" podID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerID="2fe0ae2a86589ea13c2092888b5afb5870ec708ae6ad4751775207af02033772" exitCode=0 Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.154255 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerDied","Data":"2fe0ae2a86589ea13c2092888b5afb5870ec708ae6ad4751775207af02033772"} Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.323229 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.353754 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities\") pod \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.353849 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9fns\" (UniqueName: \"kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns\") pod \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.353920 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content\") pod \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\" (UID: \"c25aa9b7-d693-4efb-80c0-28c7d44f4c59\") " Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.354815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities" (OuterVolumeSpecName: "utilities") pod "c25aa9b7-d693-4efb-80c0-28c7d44f4c59" (UID: "c25aa9b7-d693-4efb-80c0-28c7d44f4c59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.361863 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns" (OuterVolumeSpecName: "kube-api-access-r9fns") pod "c25aa9b7-d693-4efb-80c0-28c7d44f4c59" (UID: "c25aa9b7-d693-4efb-80c0-28c7d44f4c59"). InnerVolumeSpecName "kube-api-access-r9fns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.402532 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c25aa9b7-d693-4efb-80c0-28c7d44f4c59" (UID: "c25aa9b7-d693-4efb-80c0-28c7d44f4c59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.455955 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.455999 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9fns\" (UniqueName: \"kubernetes.io/projected/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-kube-api-access-r9fns\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:09 crc kubenswrapper[4990]: I1003 09:47:09.456012 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c25aa9b7-d693-4efb-80c0-28c7d44f4c59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.166233 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7297n" event={"ID":"c25aa9b7-d693-4efb-80c0-28c7d44f4c59","Type":"ContainerDied","Data":"7a613309d95e2daf57e8845de737161bfe24776f025ccd0e310e9b6a15d6d810"} Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.166320 4990 scope.go:117] "RemoveContainer" containerID="2fe0ae2a86589ea13c2092888b5afb5870ec708ae6ad4751775207af02033772" Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.166328 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7297n" Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.181909 4990 scope.go:117] "RemoveContainer" containerID="96556867a5cea0594310358c057bd26f867c2032539ece6dbe5db7b0abf1fb2f" Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.197305 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.199455 4990 scope.go:117] "RemoveContainer" containerID="394dffff0a3e8b18af0c41817fa2c5a5ad728562fece6facda0b17558a1c9e02" Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.200951 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7297n"] Oct 03 09:47:10 crc kubenswrapper[4990]: I1003 09:47:10.879648 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" path="/var/lib/kubelet/pods/c25aa9b7-d693-4efb-80c0-28c7d44f4c59/volumes" Oct 03 09:47:25 crc kubenswrapper[4990]: I1003 09:47:25.303956 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:47:25 crc kubenswrapper[4990]: I1003 09:47:25.304912 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:47:25 crc kubenswrapper[4990]: I1003 09:47:25.304992 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 
09:47:25 crc kubenswrapper[4990]: I1003 09:47:25.305885 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:47:25 crc kubenswrapper[4990]: I1003 09:47:25.305954 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75" gracePeriod=600 Oct 03 09:47:26 crc kubenswrapper[4990]: I1003 09:47:26.258853 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75" exitCode=0 Oct 03 09:47:26 crc kubenswrapper[4990]: I1003 09:47:26.259027 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75"} Oct 03 09:47:26 crc kubenswrapper[4990]: I1003 09:47:26.259313 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e"} Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.093060 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" 
containerName="oauth-openshift" containerID="cri-o://8c818eaef9847549849faf7b8076d9593ca5c83807568baafdd3e8d601b05c2b" gracePeriod=15 Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.319476 4990 generic.go:334] "Generic (PLEG): container finished" podID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerID="8c818eaef9847549849faf7b8076d9593ca5c83807568baafdd3e8d601b05c2b" exitCode=0 Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.319553 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" event={"ID":"cd1a02a7-9b7a-417f-a2b7-7421705a3010","Type":"ContainerDied","Data":"8c818eaef9847549849faf7b8076d9593ca5c83807568baafdd3e8d601b05c2b"} Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.453282 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.487497 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-9jtrw"] Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488011 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488035 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488049 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488057 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488088 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488099 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488111 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488120 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488134 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488142 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488150 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd039c2-d54b-4009-b0af-833dd18733c0" containerName="pruner" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488157 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd039c2-d54b-4009-b0af-833dd18733c0" containerName="pruner" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488166 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488173 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488182 4990 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488190 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488202 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488209 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488222 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488231 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488240 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerName="oauth-openshift" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488247 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerName="oauth-openshift" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488258 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488266 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="extract-content" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488280 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488287 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="extract-utilities" Oct 03 09:47:34 crc kubenswrapper[4990]: E1003 09:47:34.488298 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488309 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488420 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f863597-24b7-4013-b78f-daee3c406a29" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488436 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3df226-ebca-4e5a-b166-778b6c69b247" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488448 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7455e5b9-3e7a-4d4b-bf71-2b5767373598" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488459 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" containerName="oauth-openshift" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488471 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd039c2-d54b-4009-b0af-833dd18733c0" containerName="pruner" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488480 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25aa9b7-d693-4efb-80c0-28c7d44f4c59" containerName="registry-server" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.488988 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.497262 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-9jtrw"] Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543656 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543747 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543825 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543899 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543931 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.543969 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544015 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544048 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544084 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544127 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544162 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544188 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544261 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjlf8\" (UniqueName: \"kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.544284 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session\") pod \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\" (UID: \"cd1a02a7-9b7a-417f-a2b7-7421705a3010\") " Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.545238 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod 
"cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.545304 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.545900 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.546074 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.546395 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.552238 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.552627 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.552899 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.553239 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8" (OuterVolumeSpecName: "kube-api-access-jjlf8") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "kube-api-access-jjlf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.553749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.554021 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.554893 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.555093 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.555222 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cd1a02a7-9b7a-417f-a2b7-7421705a3010" (UID: "cd1a02a7-9b7a-417f-a2b7-7421705a3010"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646104 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646485 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646709 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 
09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646806 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9tn\" (UniqueName: \"kubernetes.io/projected/fe6af191-ef08-4a07-8254-e9d09be3c08c-kube-api-access-jh9tn\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646850 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646889 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.646935 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647014 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647055 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647089 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-policies\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647215 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647287 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647387 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-dir\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647409 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647538 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647558 4990 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647570 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjlf8\" (UniqueName: \"kubernetes.io/projected/cd1a02a7-9b7a-417f-a2b7-7421705a3010-kube-api-access-jjlf8\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647581 4990 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647593 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647604 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647617 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647629 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647640 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647650 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 
09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647662 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647673 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647683 4990 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.647693 4990 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd1a02a7-9b7a-417f-a2b7-7421705a3010-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749294 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: 
\"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749324 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749369 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9tn\" (UniqueName: \"kubernetes.io/projected/fe6af191-ef08-4a07-8254-e9d09be3c08c-kube-api-access-jh9tn\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749397 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749423 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749459 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749486 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749549 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749584 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-policies\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749626 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-login\") pod 
\"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749663 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749703 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-dir\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.749732 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.750681 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-dir\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.750847 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-audit-policies\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.750976 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.751393 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.753506 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.753638 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " 
pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.753657 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-error\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.753761 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.754603 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.754945 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-template-login\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.755694 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.755690 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.757923 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe6af191-ef08-4a07-8254-e9d09be3c08c-v4-0-config-system-session\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.773650 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9tn\" (UniqueName: \"kubernetes.io/projected/fe6af191-ef08-4a07-8254-e9d09be3c08c-kube-api-access-jh9tn\") pod \"oauth-openshift-79d78bfd77-9jtrw\" (UID: \"fe6af191-ef08-4a07-8254-e9d09be3c08c\") " pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:34 crc kubenswrapper[4990]: I1003 09:47:34.816059 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.260328 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d78bfd77-9jtrw"] Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.328532 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" event={"ID":"fe6af191-ef08-4a07-8254-e9d09be3c08c","Type":"ContainerStarted","Data":"ba5c271b687eb74993e061b2863267a7c9767ce61b64d81c66f2575a7559dc60"} Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.330230 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" event={"ID":"cd1a02a7-9b7a-417f-a2b7-7421705a3010","Type":"ContainerDied","Data":"3c352f16110b7055b4cb4b9f88c5322946fe8fe57930b892de070fc7e816b869"} Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.330279 4990 scope.go:117] "RemoveContainer" containerID="8c818eaef9847549849faf7b8076d9593ca5c83807568baafdd3e8d601b05c2b" Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.330440 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wnxdn" Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.362692 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:47:35 crc kubenswrapper[4990]: I1003 09:47:35.365166 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wnxdn"] Oct 03 09:47:36 crc kubenswrapper[4990]: I1003 09:47:36.335406 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" event={"ID":"fe6af191-ef08-4a07-8254-e9d09be3c08c","Type":"ContainerStarted","Data":"bd3a8441383ecb75179fb597eb7948106d31670b028f2d6e6894979c589daaee"} Oct 03 09:47:36 crc kubenswrapper[4990]: I1003 09:47:36.335807 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:36 crc kubenswrapper[4990]: I1003 09:47:36.346810 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" Oct 03 09:47:36 crc kubenswrapper[4990]: I1003 09:47:36.356792 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d78bfd77-9jtrw" podStartSLOduration=27.356768517 podStartE2EDuration="27.356768517s" podCreationTimestamp="2025-10-03 09:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:47:36.355685157 +0000 UTC m=+238.152317014" watchObservedRunningTime="2025-10-03 09:47:36.356768517 +0000 UTC m=+238.153400374" Oct 03 09:47:36 crc kubenswrapper[4990]: I1003 09:47:36.884646 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1a02a7-9b7a-417f-a2b7-7421705a3010" 
path="/var/lib/kubelet/pods/cd1a02a7-9b7a-417f-a2b7-7421705a3010/volumes" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.476290 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.477648 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rgvf" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="registry-server" containerID="cri-o://ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d" gracePeriod=30 Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.496007 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfgbn"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.496786 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfgbn" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="registry-server" containerID="cri-o://f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a" gracePeriod=30 Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.509957 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.510335 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" containerID="cri-o://f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e" gracePeriod=30 Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.514972 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.515371 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-99zt5" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="registry-server" containerID="cri-o://be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d" gracePeriod=30 Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.526578 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jx4gz"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.527766 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.540977 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.541335 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jcp8t" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="registry-server" containerID="cri-o://f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227" gracePeriod=30 Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.546645 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jx4gz"] Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.712405 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.712488 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.712541 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np9s7\" (UniqueName: \"kubernetes.io/projected/3f649d78-ed82-4728-99ff-cf70e4d53756-kube-api-access-np9s7\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.813576 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.813664 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.813703 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np9s7\" (UniqueName: \"kubernetes.io/projected/3f649d78-ed82-4728-99ff-cf70e4d53756-kube-api-access-np9s7\") pod 
\"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.814835 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.833226 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3f649d78-ed82-4728-99ff-cf70e4d53756-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.835338 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np9s7\" (UniqueName: \"kubernetes.io/projected/3f649d78-ed82-4728-99ff-cf70e4d53756-kube-api-access-np9s7\") pod \"marketplace-operator-79b997595-jx4gz\" (UID: \"3f649d78-ed82-4728-99ff-cf70e4d53756\") " pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: E1003 09:48:06.883832 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d is running failed: container process not found" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 09:48:06 crc kubenswrapper[4990]: E1003 09:48:06.884610 4990 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d is running failed: container process not found" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 09:48:06 crc kubenswrapper[4990]: E1003 09:48:06.884843 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d is running failed: container process not found" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d" cmd=["grpc_health_probe","-addr=:50051"] Oct 03 09:48:06 crc kubenswrapper[4990]: E1003 09:48:06.884878 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-99zt5" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="registry-server" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.940931 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.950543 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rgvf" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.956590 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99zt5" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.967059 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfgbn" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.984757 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcp8t" Oct 03 09:48:06 crc kubenswrapper[4990]: I1003 09:48:06.985406 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119264 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities\") pod \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119317 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content\") pod \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities\") pod \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119360 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content\") pod \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119391 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content\") pod \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119425 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca\") pod \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119469 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities\") pod \"75f571ee-03df-47ad-9c14-7b61ec0ed691\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119491 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl9kz\" (UniqueName: \"kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz\") pod \"75f571ee-03df-47ad-9c14-7b61ec0ed691\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119530 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics\") pod \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119589 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns7mh\" (UniqueName: \"kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh\") pod 
\"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\" (UID: \"0462ac94-e1a1-456b-89c7-a3a19eb8b80c\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119628 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdxsl\" (UniqueName: \"kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl\") pod \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\" (UID: \"2fdc3ca0-0b83-4b64-bd5c-985658582ae3\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119665 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content\") pod \"75f571ee-03df-47ad-9c14-7b61ec0ed691\" (UID: \"75f571ee-03df-47ad-9c14-7b61ec0ed691\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119685 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities\") pod \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119708 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-852qj\" (UniqueName: \"kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj\") pod \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\" (UID: \"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.119732 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7wlh\" (UniqueName: \"kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh\") pod \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\" (UID: \"0c77dc95-2e7f-47c7-821d-c94a13f18f32\") " Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.121591 4990 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities" (OuterVolumeSpecName: "utilities") pod "2fdc3ca0-0b83-4b64-bd5c-985658582ae3" (UID: "2fdc3ca0-0b83-4b64-bd5c-985658582ae3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.124092 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities" (OuterVolumeSpecName: "utilities") pod "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" (UID: "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.124167 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0c77dc95-2e7f-47c7-821d-c94a13f18f32" (UID: "0c77dc95-2e7f-47c7-821d-c94a13f18f32"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.124830 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities" (OuterVolumeSpecName: "utilities") pod "75f571ee-03df-47ad-9c14-7b61ec0ed691" (UID: "75f571ee-03df-47ad-9c14-7b61ec0ed691"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.126400 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh" (OuterVolumeSpecName: "kube-api-access-ns7mh") pod "0462ac94-e1a1-456b-89c7-a3a19eb8b80c" (UID: "0462ac94-e1a1-456b-89c7-a3a19eb8b80c"). InnerVolumeSpecName "kube-api-access-ns7mh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.126708 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0c77dc95-2e7f-47c7-821d-c94a13f18f32" (UID: "0c77dc95-2e7f-47c7-821d-c94a13f18f32"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.127436 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities" (OuterVolumeSpecName: "utilities") pod "0462ac94-e1a1-456b-89c7-a3a19eb8b80c" (UID: "0462ac94-e1a1-456b-89c7-a3a19eb8b80c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.127490 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz" (OuterVolumeSpecName: "kube-api-access-zl9kz") pod "75f571ee-03df-47ad-9c14-7b61ec0ed691" (UID: "75f571ee-03df-47ad-9c14-7b61ec0ed691"). InnerVolumeSpecName "kube-api-access-zl9kz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.130777 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj" (OuterVolumeSpecName: "kube-api-access-852qj") pod "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" (UID: "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a"). InnerVolumeSpecName "kube-api-access-852qj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.131439 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh" (OuterVolumeSpecName: "kube-api-access-b7wlh") pod "0c77dc95-2e7f-47c7-821d-c94a13f18f32" (UID: "0c77dc95-2e7f-47c7-821d-c94a13f18f32"). InnerVolumeSpecName "kube-api-access-b7wlh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.136071 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl" (OuterVolumeSpecName: "kube-api-access-jdxsl") pod "2fdc3ca0-0b83-4b64-bd5c-985658582ae3" (UID: "2fdc3ca0-0b83-4b64-bd5c-985658582ae3"). InnerVolumeSpecName "kube-api-access-jdxsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.144782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75f571ee-03df-47ad-9c14-7b61ec0ed691" (UID: "75f571ee-03df-47ad-9c14-7b61ec0ed691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.204598 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" (UID: "7a3a86ea-aa6a-4dc4-8aed-794167e95b8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.208240 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0462ac94-e1a1-456b-89c7-a3a19eb8b80c" (UID: "0462ac94-e1a1-456b-89c7-a3a19eb8b80c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.210026 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jx4gz"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220571 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220599 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220611 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220621 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220632 4990 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220642 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220650 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl9kz\" (UniqueName: \"kubernetes.io/projected/75f571ee-03df-47ad-9c14-7b61ec0ed691-kube-api-access-zl9kz\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220658 4990 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0c77dc95-2e7f-47c7-821d-c94a13f18f32-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220668 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns7mh\" (UniqueName: \"kubernetes.io/projected/0462ac94-e1a1-456b-89c7-a3a19eb8b80c-kube-api-access-ns7mh\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220677 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdxsl\" (UniqueName: \"kubernetes.io/projected/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-kube-api-access-jdxsl\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220685 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f571ee-03df-47ad-9c14-7b61ec0ed691-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220700 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220725 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-852qj\" (UniqueName: \"kubernetes.io/projected/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a-kube-api-access-852qj\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.220734 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7wlh\" (UniqueName: \"kubernetes.io/projected/0c77dc95-2e7f-47c7-821d-c94a13f18f32-kube-api-access-b7wlh\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.246656 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fdc3ca0-0b83-4b64-bd5c-985658582ae3" (UID: "2fdc3ca0-0b83-4b64-bd5c-985658582ae3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.322030 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fdc3ca0-0b83-4b64-bd5c-985658582ae3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.573740 4990 generic.go:334] "Generic (PLEG): container finished" podID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerID="ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d" exitCode=0
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.573820 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerDied","Data":"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.573860 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rgvf" event={"ID":"0462ac94-e1a1-456b-89c7-a3a19eb8b80c","Type":"ContainerDied","Data":"c9f095e25895460af208577df0aa7d1fa68724bec0ae652e05bac276677fae48"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.573860 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rgvf"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.573882 4990 scope.go:117] "RemoveContainer" containerID="ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.576902 4990 generic.go:334] "Generic (PLEG): container finished" podID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerID="f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e" exitCode=0
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.576962 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" event={"ID":"0c77dc95-2e7f-47c7-821d-c94a13f18f32","Type":"ContainerDied","Data":"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.576983 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm" event={"ID":"0c77dc95-2e7f-47c7-821d-c94a13f18f32","Type":"ContainerDied","Data":"f49dfab5ba6806841e4b6439652ecdcee8c5b84e11b9ffc312dd8ada3f6f8ff9"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.577061 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-prdsm"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.580554 4990 generic.go:334] "Generic (PLEG): container finished" podID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerID="f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a" exitCode=0
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.580676 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfgbn"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.580645 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerDied","Data":"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.580843 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfgbn" event={"ID":"7a3a86ea-aa6a-4dc4-8aed-794167e95b8a","Type":"ContainerDied","Data":"06e210acbf4e8997bfd470bd98064dec0906f35790680683c578200d6fbe6c0a"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.584415 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerID="f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227" exitCode=0
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.584756 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerDied","Data":"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.584794 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jcp8t" event={"ID":"2fdc3ca0-0b83-4b64-bd5c-985658582ae3","Type":"ContainerDied","Data":"9ac21dda45428f62266c3560601a550e21a68f6b1fa560192dfc6d228da242bb"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.584869 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jcp8t"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.586504 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" event={"ID":"3f649d78-ed82-4728-99ff-cf70e4d53756","Type":"ContainerStarted","Data":"d75f24d700115f91ab16b8addc5fe1be709a4e0ff0dfcedc702da42226834d7c"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.586578 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" event={"ID":"3f649d78-ed82-4728-99ff-cf70e4d53756","Type":"ContainerStarted","Data":"abfb0d55827aade496f276252f75c867a51f0c2709c71eb72761d6dd38a05bc6"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.587866 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.589243 4990 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jx4gz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body=
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.589300 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" podUID="3f649d78-ed82-4728-99ff-cf70e4d53756" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.590232 4990 generic.go:334] "Generic (PLEG): container finished" podID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d" exitCode=0
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.590284 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerDied","Data":"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.590315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99zt5" event={"ID":"75f571ee-03df-47ad-9c14-7b61ec0ed691","Type":"ContainerDied","Data":"72c0d1f53540f638fdef58401ddc8b3e8b9bac6fdcddf09006780a4ef819b552"}
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.590360 4990 scope.go:117] "RemoveContainer" containerID="c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.590501 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99zt5"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.625337 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz" podStartSLOduration=1.6253106480000001 podStartE2EDuration="1.625310648s" podCreationTimestamp="2025-10-03 09:48:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:48:07.623299372 +0000 UTC m=+269.419931229" watchObservedRunningTime="2025-10-03 09:48:07.625310648 +0000 UTC m=+269.421942505"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.651581 4990 scope.go:117] "RemoveContainer" containerID="0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.683235 4990 scope.go:117] "RemoveContainer" containerID="ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.683974 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d\": container with ID starting with ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d not found: ID does not exist" containerID="ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.684038 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d"} err="failed to get container status \"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d\": rpc error: code = NotFound desc = could not find container \"ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d\": container with ID starting with ebecb3669cb3c99b4cdf82b84b8035d0bf67a592c51244922b5eab3d5e054d9d not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.684074 4990 scope.go:117] "RemoveContainer" containerID="c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.684559 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6\": container with ID starting with c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6 not found: ID does not exist" containerID="c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.684648 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6"} err="failed to get container status \"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6\": rpc error: code = NotFound desc = could not find container \"c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6\": container with ID starting with c5fca86b66b6a941b4b40887b732119a58e4d40c2ea9445bd0a4079aece5efd6 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.684717 4990 scope.go:117] "RemoveContainer" containerID="0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.685207 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942\": container with ID starting with 0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942 not found: ID does not exist" containerID="0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.685244 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942"} err="failed to get container status \"0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942\": rpc error: code = NotFound desc = could not find container \"0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942\": container with ID starting with 0fbe45ee6c9f76e3cd9b35ae683e0455f66ea0cc77fb4633155633603e51f942 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.685280 4990 scope.go:117] "RemoveContainer" containerID="f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.702158 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.705614 4990 scope.go:117] "RemoveContainer" containerID="f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.706126 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e\": container with ID starting with f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e not found: ID does not exist" containerID="f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.706174 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e"} err="failed to get container status \"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e\": rpc error: code = NotFound desc = could not find container \"f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e\": container with ID starting with f4a808152fae2345f43008a9e5d711094c8be43dbc9e9d771039f4de3d3e7b9e not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.706216 4990 scope.go:117] "RemoveContainer" containerID="f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.708179 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jcp8t"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.725679 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfgbn"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.738817 4990 scope.go:117] "RemoveContainer" containerID="35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.740608 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfgbn"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.756635 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.758791 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-prdsm"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.763906 4990 scope.go:117] "RemoveContainer" containerID="ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.768064 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.779645 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-99zt5"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.784717 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.787087 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rgvf"]
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.789333 4990 scope.go:117] "RemoveContainer" containerID="f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.789918 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a\": container with ID starting with f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a not found: ID does not exist" containerID="f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.789989 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a"} err="failed to get container status \"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a\": rpc error: code = NotFound desc = could not find container \"f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a\": container with ID starting with f1a07ce377af7b83ed80b87d8377cc141a3dd856ad1894a077fcb6c6f385259a not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.790038 4990 scope.go:117] "RemoveContainer" containerID="35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.790722 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6\": container with ID starting with 35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6 not found: ID does not exist" containerID="35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.790776 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6"} err="failed to get container status \"35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6\": rpc error: code = NotFound desc = could not find container \"35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6\": container with ID starting with 35baf20c7940c5b8a6010113710baaa22d194db745b7716007f575b6a17790d6 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.790815 4990 scope.go:117] "RemoveContainer" containerID="ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.791122 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe\": container with ID starting with ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe not found: ID does not exist" containerID="ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.791162 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe"} err="failed to get container status \"ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe\": rpc error: code = NotFound desc = could not find container \"ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe\": container with ID starting with ef610545d51dd14f43ca5c21b0111f57f33239d227feb492f7d7e9c18e01c1fe not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.791185 4990 scope.go:117] "RemoveContainer" containerID="f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.809695 4990 scope.go:117] "RemoveContainer" containerID="2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.825716 4990 scope.go:117] "RemoveContainer" containerID="965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.842775 4990 scope.go:117] "RemoveContainer" containerID="f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.843606 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227\": container with ID starting with f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227 not found: ID does not exist" containerID="f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.843663 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227"} err="failed to get container status \"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227\": rpc error: code = NotFound desc = could not find container \"f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227\": container with ID starting with f43587dcf8d204b38259921257678b7e673b649636c4a994a3b9db7c68500227 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.843709 4990 scope.go:117] "RemoveContainer" containerID="2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.844185 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932\": container with ID starting with 2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932 not found: ID does not exist" containerID="2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.844214 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932"} err="failed to get container status \"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932\": rpc error: code = NotFound desc = could not find container \"2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932\": container with ID starting with 2f27cd6fc7d850a997f8ef0572a3a196a93b6c83a21cc162594674bacf2a4932 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.844231 4990 scope.go:117] "RemoveContainer" containerID="965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.844458 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200\": container with ID starting with 965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200 not found: ID does not exist" containerID="965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.844482 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200"} err="failed to get container status \"965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200\": rpc error: code = NotFound desc = could not find container \"965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200\": container with ID starting with 965c8c987673eb82e8bcefe35f6e87abc65ebede0efa69814fac5576907d2200 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.844498 4990 scope.go:117] "RemoveContainer" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.857285 4990 scope.go:117] "RemoveContainer" containerID="12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.871658 4990 scope.go:117] "RemoveContainer" containerID="196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.887856 4990 scope.go:117] "RemoveContainer" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.888377 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d\": container with ID starting with be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d not found: ID does not exist" containerID="be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.888416 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d"} err="failed to get container status \"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d\": rpc error: code = NotFound desc = could not find container \"be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d\": container with ID starting with be0756a210f61d6dc8f93918c1756d25d95ffc16b74a40f0fae903e24ff7587d not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.888447 4990 scope.go:117] "RemoveContainer" containerID="12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.888815 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441\": container with ID starting with 12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441 not found: ID does not exist" containerID="12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.888862 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441"} err="failed to get container status \"12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441\": rpc error: code = NotFound desc = could not find container \"12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441\": container with ID starting with 12e4815b5db514cff6873ae259c4a9a9c1251473a7f92dbabd5297551347f441 not found: ID does not exist"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.888896 4990 scope.go:117] "RemoveContainer" containerID="196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854"
Oct 03 09:48:07 crc kubenswrapper[4990]: E1003 09:48:07.889172 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854\": container with ID starting with 196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854 not found: ID does not exist" containerID="196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854"
Oct 03 09:48:07 crc kubenswrapper[4990]: I1003 09:48:07.889210 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854"} err="failed to get container status \"196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854\": rpc error: code = NotFound desc = could not find container \"196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854\": container with ID starting with 196ba518898b849380b85554f41fa4c2046ee916f6a469dd5d40b52ae559a854 not found: ID does not exist"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.605475 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jx4gz"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.694614 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8wnd4"]
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695369 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="extract-utilities"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695390 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="extract-utilities"
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695429 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="extract-content"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695438 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="extract-content"
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695449 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="registry-server"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695458 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="registry-server"
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695468 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="registry-server"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695476 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="registry-server"
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695485 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="extract-content"
Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695492 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="extract-content"
Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695504 4990
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695523 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695533 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="extract-content" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695540 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="extract-content" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695553 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695562 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695578 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="extract-content" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695586 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="extract-content" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695595 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695603 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695615 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695623 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="extract-utilities" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695632 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695641 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" Oct 03 09:48:08 crc kubenswrapper[4990]: E1003 09:48:08.695656 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695664 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695769 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695782 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695792 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.695801 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" containerName="marketplace-operator" Oct 03 09:48:08 crc kubenswrapper[4990]: 
I1003 09:48:08.695815 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" containerName="registry-server" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.696943 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.698872 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.700264 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wnd4"] Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.838991 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-utilities\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.839058 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-catalog-content\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.839224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8hbz\" (UniqueName: \"kubernetes.io/projected/d0d5f333-6e56-46b8-9d2d-031053c96543-kube-api-access-c8hbz\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: 
I1003 09:48:08.883458 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0462ac94-e1a1-456b-89c7-a3a19eb8b80c" path="/var/lib/kubelet/pods/0462ac94-e1a1-456b-89c7-a3a19eb8b80c/volumes" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.884137 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c77dc95-2e7f-47c7-821d-c94a13f18f32" path="/var/lib/kubelet/pods/0c77dc95-2e7f-47c7-821d-c94a13f18f32/volumes" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.884691 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fdc3ca0-0b83-4b64-bd5c-985658582ae3" path="/var/lib/kubelet/pods/2fdc3ca0-0b83-4b64-bd5c-985658582ae3/volumes" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.886186 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f571ee-03df-47ad-9c14-7b61ec0ed691" path="/var/lib/kubelet/pods/75f571ee-03df-47ad-9c14-7b61ec0ed691/volumes" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.887265 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3a86ea-aa6a-4dc4-8aed-794167e95b8a" path="/var/lib/kubelet/pods/7a3a86ea-aa6a-4dc4-8aed-794167e95b8a/volumes" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.899337 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l2brd"] Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.900679 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.906336 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.909880 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2brd"] Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.940883 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8hbz\" (UniqueName: \"kubernetes.io/projected/d0d5f333-6e56-46b8-9d2d-031053c96543-kube-api-access-c8hbz\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.941042 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-utilities\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.941541 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-utilities\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.941605 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-catalog-content\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " 
pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.941879 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0d5f333-6e56-46b8-9d2d-031053c96543-catalog-content\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:08 crc kubenswrapper[4990]: I1003 09:48:08.960152 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8hbz\" (UniqueName: \"kubernetes.io/projected/d0d5f333-6e56-46b8-9d2d-031053c96543-kube-api-access-c8hbz\") pod \"redhat-marketplace-8wnd4\" (UID: \"d0d5f333-6e56-46b8-9d2d-031053c96543\") " pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.018430 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.042412 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-catalog-content\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.042743 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5wt\" (UniqueName: \"kubernetes.io/projected/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-kube-api-access-hb5wt\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.042893 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-utilities\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.144852 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5wt\" (UniqueName: \"kubernetes.io/projected/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-kube-api-access-hb5wt\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.144940 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-utilities\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.144995 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-catalog-content\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.145594 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-catalog-content\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.145625 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-utilities\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.171091 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5wt\" (UniqueName: \"kubernetes.io/projected/317eadb4-b7e8-41c2-b7e8-a91b0d4719d1-kube-api-access-hb5wt\") pod \"community-operators-l2brd\" (UID: \"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1\") " pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.214367 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wnd4"] Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.223463 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:09 crc kubenswrapper[4990]: W1003 09:48:09.223932 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d5f333_6e56_46b8_9d2d_031053c96543.slice/crio-a0716a6cefa0fd1a5a00c5f0cf4139203e583886e172d4bbcfecdd6feb3aa43c WatchSource:0}: Error finding container a0716a6cefa0fd1a5a00c5f0cf4139203e583886e172d4bbcfecdd6feb3aa43c: Status 404 returned error can't find the container with id a0716a6cefa0fd1a5a00c5f0cf4139203e583886e172d4bbcfecdd6feb3aa43c Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.423687 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l2brd"] Oct 03 09:48:09 crc kubenswrapper[4990]: W1003 09:48:09.497060 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317eadb4_b7e8_41c2_b7e8_a91b0d4719d1.slice/crio-3293cab301d03e6caaa7f7bbe5e66a72e6c0f260f11a8b5f322861747783e88a WatchSource:0}: Error finding container 3293cab301d03e6caaa7f7bbe5e66a72e6c0f260f11a8b5f322861747783e88a: Status 404 returned error can't find the container with id 3293cab301d03e6caaa7f7bbe5e66a72e6c0f260f11a8b5f322861747783e88a Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.613447 4990 generic.go:334] "Generic (PLEG): container finished" podID="d0d5f333-6e56-46b8-9d2d-031053c96543" containerID="bed384b13b9e64213a03fdd826ba5a3d3b2bce9f858e7bc846b18eb687403151" exitCode=0 Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.613892 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wnd4" event={"ID":"d0d5f333-6e56-46b8-9d2d-031053c96543","Type":"ContainerDied","Data":"bed384b13b9e64213a03fdd826ba5a3d3b2bce9f858e7bc846b18eb687403151"} Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.614250 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wnd4" event={"ID":"d0d5f333-6e56-46b8-9d2d-031053c96543","Type":"ContainerStarted","Data":"a0716a6cefa0fd1a5a00c5f0cf4139203e583886e172d4bbcfecdd6feb3aa43c"} Oct 03 09:48:09 crc kubenswrapper[4990]: I1003 09:48:09.617793 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2brd" event={"ID":"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1","Type":"ContainerStarted","Data":"3293cab301d03e6caaa7f7bbe5e66a72e6c0f260f11a8b5f322861747783e88a"} Oct 03 09:48:10 crc kubenswrapper[4990]: I1003 09:48:10.624570 4990 generic.go:334] "Generic (PLEG): container finished" podID="317eadb4-b7e8-41c2-b7e8-a91b0d4719d1" containerID="34200684690f25717d1f22dfa29edc1aaa8d7043e7c4f28d798efb589b6bb689" exitCode=0 Oct 03 09:48:10 crc kubenswrapper[4990]: I1003 09:48:10.624675 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-l2brd" event={"ID":"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1","Type":"ContainerDied","Data":"34200684690f25717d1f22dfa29edc1aaa8d7043e7c4f28d798efb589b6bb689"} Oct 03 09:48:10 crc kubenswrapper[4990]: I1003 09:48:10.628161 4990 generic.go:334] "Generic (PLEG): container finished" podID="d0d5f333-6e56-46b8-9d2d-031053c96543" containerID="4a52d3394d71b0298ef07778c7a0800fa6afe77b292e38e16d11d5c71432a535" exitCode=0 Oct 03 09:48:10 crc kubenswrapper[4990]: I1003 09:48:10.628185 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wnd4" event={"ID":"d0d5f333-6e56-46b8-9d2d-031053c96543","Type":"ContainerDied","Data":"4a52d3394d71b0298ef07778c7a0800fa6afe77b292e38e16d11d5c71432a535"} Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.096440 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6jsd"] Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.097912 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.100751 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.113035 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6jsd"] Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.176324 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-catalog-content\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.176396 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-utilities\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.176489 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qzm\" (UniqueName: \"kubernetes.io/projected/26f2805e-e269-4ee9-8e83-2513cea31add-kube-api-access-k9qzm\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.277553 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-utilities\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " 
pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.278034 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-catalog-content\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.278075 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qzm\" (UniqueName: \"kubernetes.io/projected/26f2805e-e269-4ee9-8e83-2513cea31add-kube-api-access-k9qzm\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.279003 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-utilities\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.279296 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26f2805e-e269-4ee9-8e83-2513cea31add-catalog-content\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.299530 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.300948 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.302970 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qzm\" (UniqueName: \"kubernetes.io/projected/26f2805e-e269-4ee9-8e83-2513cea31add-kube-api-access-k9qzm\") pod \"redhat-operators-m6jsd\" (UID: \"26f2805e-e269-4ee9-8e83-2513cea31add\") " pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.305056 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.317369 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.380010 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.380384 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfj5\" (UniqueName: \"kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.380761 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities\") pod \"certified-operators-xgt6j\" (UID: 
\"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.439821 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.482176 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.482293 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.482335 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfj5\" (UniqueName: \"kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.483317 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.483634 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.506243 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfj5\" (UniqueName: \"kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5\") pod \"certified-operators-xgt6j\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.640767 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wnd4" event={"ID":"d0d5f333-6e56-46b8-9d2d-031053c96543","Type":"ContainerStarted","Data":"370deee5ee81ca2410ad5fbb9fe2efd0e7f700cc23dd5a2ab70e74ed48e1da7b"} Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.642160 4990 generic.go:334] "Generic (PLEG): container finished" podID="317eadb4-b7e8-41c2-b7e8-a91b0d4719d1" containerID="f7e76c5fe1596f5e6bbe8824deefb07b032e3855fa64c2be743ab01d50e69ddd" exitCode=0 Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.642205 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2brd" event={"ID":"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1","Type":"ContainerDied","Data":"f7e76c5fe1596f5e6bbe8824deefb07b032e3855fa64c2be743ab01d50e69ddd"} Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.662542 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8wnd4" podStartSLOduration=2.263702423 podStartE2EDuration="3.662499801s" podCreationTimestamp="2025-10-03 09:48:08 +0000 UTC" firstStartedPulling="2025-10-03 09:48:09.615373294 +0000 UTC m=+271.412005151" lastFinishedPulling="2025-10-03 09:48:11.014170652 +0000 
UTC m=+272.810802529" observedRunningTime="2025-10-03 09:48:11.661205134 +0000 UTC m=+273.457836991" watchObservedRunningTime="2025-10-03 09:48:11.662499801 +0000 UTC m=+273.459131668" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.665744 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.709679 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6jsd"] Oct 03 09:48:11 crc kubenswrapper[4990]: I1003 09:48:11.921371 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 09:48:11 crc kubenswrapper[4990]: W1003 09:48:11.984432 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13713c9_9230_46ad_8b93_5ea52833d53e.slice/crio-42b6917ab0d6e650041bd6e641c9ae655a96e0bd502a6e515fbc409bca811757 WatchSource:0}: Error finding container 42b6917ab0d6e650041bd6e641c9ae655a96e0bd502a6e515fbc409bca811757: Status 404 returned error can't find the container with id 42b6917ab0d6e650041bd6e641c9ae655a96e0bd502a6e515fbc409bca811757 Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.665769 4990 generic.go:334] "Generic (PLEG): container finished" podID="26f2805e-e269-4ee9-8e83-2513cea31add" containerID="a5f273019ef1fddc54fd508ed3194d876d1978709fb5760ffc94afd0d054b072" exitCode=0 Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.666572 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jsd" event={"ID":"26f2805e-e269-4ee9-8e83-2513cea31add","Type":"ContainerDied","Data":"a5f273019ef1fddc54fd508ed3194d876d1978709fb5760ffc94afd0d054b072"} Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.667496 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-m6jsd" event={"ID":"26f2805e-e269-4ee9-8e83-2513cea31add","Type":"ContainerStarted","Data":"96e29bf15f75d257b230cd93c64d59858e7a471d6a028a4d3729811b00223b08"} Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.669531 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l2brd" event={"ID":"317eadb4-b7e8-41c2-b7e8-a91b0d4719d1","Type":"ContainerStarted","Data":"f3c13741ac439c2983f62107cd7310bd686286a819bb7aa79595c2658540239f"} Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.674364 4990 generic.go:334] "Generic (PLEG): container finished" podID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerID="dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8" exitCode=0 Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.674547 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerDied","Data":"dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8"} Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.674634 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerStarted","Data":"42b6917ab0d6e650041bd6e641c9ae655a96e0bd502a6e515fbc409bca811757"} Oct 03 09:48:12 crc kubenswrapper[4990]: I1003 09:48:12.741271 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l2brd" podStartSLOduration=3.168289899 podStartE2EDuration="4.741250713s" podCreationTimestamp="2025-10-03 09:48:08 +0000 UTC" firstStartedPulling="2025-10-03 09:48:10.626178302 +0000 UTC m=+272.422810149" lastFinishedPulling="2025-10-03 09:48:12.199139106 +0000 UTC m=+273.995770963" observedRunningTime="2025-10-03 09:48:12.740015629 +0000 UTC m=+274.536647486" 
watchObservedRunningTime="2025-10-03 09:48:12.741250713 +0000 UTC m=+274.537882570" Oct 03 09:48:14 crc kubenswrapper[4990]: I1003 09:48:14.690286 4990 generic.go:334] "Generic (PLEG): container finished" podID="26f2805e-e269-4ee9-8e83-2513cea31add" containerID="9ee3c5e098a224d422bca8b23b3caa3c94a86b7c37340a55508d180e551659ff" exitCode=0 Oct 03 09:48:14 crc kubenswrapper[4990]: I1003 09:48:14.690375 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jsd" event={"ID":"26f2805e-e269-4ee9-8e83-2513cea31add","Type":"ContainerDied","Data":"9ee3c5e098a224d422bca8b23b3caa3c94a86b7c37340a55508d180e551659ff"} Oct 03 09:48:14 crc kubenswrapper[4990]: I1003 09:48:14.697386 4990 generic.go:334] "Generic (PLEG): container finished" podID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerID="29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60" exitCode=0 Oct 03 09:48:14 crc kubenswrapper[4990]: I1003 09:48:14.697431 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerDied","Data":"29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60"} Oct 03 09:48:16 crc kubenswrapper[4990]: I1003 09:48:16.712420 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6jsd" event={"ID":"26f2805e-e269-4ee9-8e83-2513cea31add","Type":"ContainerStarted","Data":"241f8a6aeb285b398001d07deb6132ae7bd1ebb46fe721ff03ebebd04360979a"} Oct 03 09:48:16 crc kubenswrapper[4990]: I1003 09:48:16.716714 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerStarted","Data":"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413"} Oct 03 09:48:16 crc kubenswrapper[4990]: I1003 09:48:16.739020 4990 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-m6jsd" podStartSLOduration=3.066035197 podStartE2EDuration="5.738903413s" podCreationTimestamp="2025-10-03 09:48:11 +0000 UTC" firstStartedPulling="2025-10-03 09:48:12.66871731 +0000 UTC m=+274.465349167" lastFinishedPulling="2025-10-03 09:48:15.341585526 +0000 UTC m=+277.138217383" observedRunningTime="2025-10-03 09:48:16.737911975 +0000 UTC m=+278.534543862" watchObservedRunningTime="2025-10-03 09:48:16.738903413 +0000 UTC m=+278.535535280" Oct 03 09:48:16 crc kubenswrapper[4990]: I1003 09:48:16.762326 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xgt6j" podStartSLOduration=3.324484964 podStartE2EDuration="5.762304695s" podCreationTimestamp="2025-10-03 09:48:11 +0000 UTC" firstStartedPulling="2025-10-03 09:48:12.678625767 +0000 UTC m=+274.475257624" lastFinishedPulling="2025-10-03 09:48:15.116445498 +0000 UTC m=+276.913077355" observedRunningTime="2025-10-03 09:48:16.758935821 +0000 UTC m=+278.555567678" watchObservedRunningTime="2025-10-03 09:48:16.762304695 +0000 UTC m=+278.558936552" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.019336 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.021988 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.069358 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.224194 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.224284 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.270448 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.767627 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l2brd" Oct 03 09:48:19 crc kubenswrapper[4990]: I1003 09:48:19.773093 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8wnd4" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.440956 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.441484 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.488280 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.666311 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.666466 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.703553 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:21 crc kubenswrapper[4990]: I1003 09:48:21.782765 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 09:48:21 
crc kubenswrapper[4990]: I1003 09:48:21.783461 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6jsd" Oct 03 09:49:25 crc kubenswrapper[4990]: I1003 09:49:25.303731 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:49:25 crc kubenswrapper[4990]: I1003 09:49:25.304642 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:49:55 crc kubenswrapper[4990]: I1003 09:49:55.303802 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:49:55 crc kubenswrapper[4990]: I1003 09:49:55.304854 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:50:17 crc kubenswrapper[4990]: I1003 09:50:17.971264 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc4bx"] Oct 03 09:50:17 crc kubenswrapper[4990]: I1003 09:50:17.973236 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:17 crc kubenswrapper[4990]: I1003 09:50:17.984843 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc4bx"] Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123364 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2rm\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-kube-api-access-zq2rm\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123483 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-registry-certificates\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123529 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-trusted-ca\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123551 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a233285-552b-4a2c-b193-d758f23e0649-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123603 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a233285-552b-4a2c-b193-d758f23e0649-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123630 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-bound-sa-token\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-registry-tls\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.123685 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.144431 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2rm\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-kube-api-access-zq2rm\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225400 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-registry-certificates\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225441 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-trusted-ca\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225471 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a233285-552b-4a2c-b193-d758f23e0649-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225546 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a233285-552b-4a2c-b193-d758f23e0649-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225616 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-bound-sa-token\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.225651 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-registry-tls\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.227718 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-registry-certificates\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.227941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a233285-552b-4a2c-b193-d758f23e0649-trusted-ca\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc 
kubenswrapper[4990]: I1003 09:50:18.228070 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a233285-552b-4a2c-b193-d758f23e0649-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.233891 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a233285-552b-4a2c-b193-d758f23e0649-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.234139 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-registry-tls\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.242852 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2rm\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-kube-api-access-zq2rm\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.243848 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a233285-552b-4a2c-b193-d758f23e0649-bound-sa-token\") pod \"image-registry-66df7c8f76-bc4bx\" (UID: \"4a233285-552b-4a2c-b193-d758f23e0649\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.295025 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:18 crc kubenswrapper[4990]: I1003 09:50:18.474083 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc4bx"] Oct 03 09:50:19 crc kubenswrapper[4990]: I1003 09:50:19.421703 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" event={"ID":"4a233285-552b-4a2c-b193-d758f23e0649","Type":"ContainerStarted","Data":"730275b6c60089b187dc26d58779b79e76dd5ff3727f61bcbf9115dd554e94d9"} Oct 03 09:50:19 crc kubenswrapper[4990]: I1003 09:50:19.422173 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:19 crc kubenswrapper[4990]: I1003 09:50:19.423628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" event={"ID":"4a233285-552b-4a2c-b193-d758f23e0649","Type":"ContainerStarted","Data":"b41fbe140ab5c389a0bd5a59b58a8e27b2fb5a1dca4b132b8f8c8697ff52c480"} Oct 03 09:50:19 crc kubenswrapper[4990]: I1003 09:50:19.446691 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" podStartSLOduration=2.44666594 podStartE2EDuration="2.44666594s" podCreationTimestamp="2025-10-03 09:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:50:19.443331563 +0000 UTC m=+401.239963420" watchObservedRunningTime="2025-10-03 09:50:19.44666594 +0000 UTC m=+401.243297807" Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.304353 4990 patch_prober.go:28] interesting 
pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.305219 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.305301 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.306281 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.306387 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e" gracePeriod=600 Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.462369 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e" exitCode=0 Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.462424 
4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e"} Oct 03 09:50:25 crc kubenswrapper[4990]: I1003 09:50:25.462465 4990 scope.go:117] "RemoveContainer" containerID="859805407f0015f647a9abeff75fc8bf25870c44ec65e6150451a229fd09bf75" Oct 03 09:50:26 crc kubenswrapper[4990]: I1003 09:50:26.470406 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657"} Oct 03 09:50:38 crc kubenswrapper[4990]: I1003 09:50:38.304758 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bc4bx" Oct 03 09:50:38 crc kubenswrapper[4990]: I1003 09:50:38.373713 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.429462 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" podUID="becdef88-76d3-402a-b26f-23a4cbdf1644" containerName="registry" containerID="cri-o://4847044c81b2e5d05d9f6c1f20532a81232b5a99b0ab18b730f8309def921ef0" gracePeriod=30 Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.718088 4990 generic.go:334] "Generic (PLEG): container finished" podID="becdef88-76d3-402a-b26f-23a4cbdf1644" containerID="4847044c81b2e5d05d9f6c1f20532a81232b5a99b0ab18b730f8309def921ef0" exitCode=0 Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.718200 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" 
event={"ID":"becdef88-76d3-402a-b26f-23a4cbdf1644","Type":"ContainerDied","Data":"4847044c81b2e5d05d9f6c1f20532a81232b5a99b0ab18b730f8309def921ef0"} Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.774838 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.930279 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.930370 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.931848 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.931876 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4qn\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.932208 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.932368 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.932581 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.932722 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.933076 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token\") pod \"becdef88-76d3-402a-b26f-23a4cbdf1644\" (UID: \"becdef88-76d3-402a-b26f-23a4cbdf1644\") " Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.933472 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.933872 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.933960 4990 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.939418 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.939976 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.942835 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn" (OuterVolumeSpecName: "kube-api-access-jb4qn") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "kube-api-access-jb4qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.942913 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.952023 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:51:03 crc kubenswrapper[4990]: I1003 09:51:03.953291 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "becdef88-76d3-402a-b26f-23a4cbdf1644" (UID: "becdef88-76d3-402a-b26f-23a4cbdf1644"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.034540 4990 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/becdef88-76d3-402a-b26f-23a4cbdf1644-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.034584 4990 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/becdef88-76d3-402a-b26f-23a4cbdf1644-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.034596 4990 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.034606 4990 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.034616 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4qn\" (UniqueName: \"kubernetes.io/projected/becdef88-76d3-402a-b26f-23a4cbdf1644-kube-api-access-jb4qn\") on node \"crc\" DevicePath \"\"" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.725858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" event={"ID":"becdef88-76d3-402a-b26f-23a4cbdf1644","Type":"ContainerDied","Data":"6dfdaab6f1e077bd805538d3cc8e1e57e78fdf9f0006e00636b7b799d99cdc97"} Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.725948 4990 scope.go:117] "RemoveContainer" containerID="4847044c81b2e5d05d9f6c1f20532a81232b5a99b0ab18b730f8309def921ef0" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.726008 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hmdg" Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.761763 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.766794 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hmdg"] Oct 03 09:51:04 crc kubenswrapper[4990]: I1003 09:51:04.884982 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="becdef88-76d3-402a-b26f-23a4cbdf1644" path="/var/lib/kubelet/pods/becdef88-76d3-402a-b26f-23a4cbdf1644/volumes" Oct 03 09:52:25 crc kubenswrapper[4990]: I1003 09:52:25.303910 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:52:25 crc kubenswrapper[4990]: I1003 09:52:25.304555 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:52:55 crc 
kubenswrapper[4990]: I1003 09:52:55.304347 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:52:55 crc kubenswrapper[4990]: I1003 09:52:55.305094 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.304078 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.305017 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.305094 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.305970 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657"} 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.306070 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657" gracePeriod=600 Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.573882 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657" exitCode=0 Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.573979 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657"} Oct 03 09:53:25 crc kubenswrapper[4990]: I1003 09:53:25.574445 4990 scope.go:117] "RemoveContainer" containerID="593e902b7ac753b682276d32de4451bd59f20dd111b69f74af7ec96904872b2e" Oct 03 09:53:26 crc kubenswrapper[4990]: I1003 09:53:26.584259 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.310997 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7rqmg"] Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312379 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="nbdb" containerID="cri-o://c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312455 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-acl-logging" containerID="cri-o://30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312475 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312593 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="sbdb" containerID="cri-o://6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312365 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-controller" containerID="cri-o://a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312563 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-node" 
containerID="cri-o://04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.312759 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="northd" containerID="cri-o://5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.393287 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" containerID="cri-o://be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" gracePeriod=30 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.682883 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/3.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.687043 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovn-acl-logging/0.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.687907 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovn-controller/0.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.688656 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756207 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qbw4z"] Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756578 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756598 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756611 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756619 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756630 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="nbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756637 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="nbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756647 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756656 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756665 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-acl-logging" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756675 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-acl-logging" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756684 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756691 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756700 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-node" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756707 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-node" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756723 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becdef88-76d3-402a-b26f-23a4cbdf1644" containerName="registry" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756731 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="becdef88-76d3-402a-b26f-23a4cbdf1644" containerName="registry" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756743 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="northd" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756753 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="northd" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756763 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="sbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756769 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="sbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756785 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756793 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756804 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756811 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756819 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kubecfg-setup" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756826 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kubecfg-setup" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.756836 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756843 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756979 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.756992 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-node" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757010 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="sbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757024 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="northd" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757039 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757048 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757058 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="becdef88-76d3-402a-b26f-23a4cbdf1644" containerName="registry" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757067 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757078 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-acl-logging" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757088 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757101 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovn-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757110 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="nbdb" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.757361 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerName="ovnkube-controller" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.760605 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890809 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890879 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890902 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890921 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890956 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890977 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.890999 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891020 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash" (OuterVolumeSpecName: "host-slash") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891033 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891055 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891060 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log" (OuterVolumeSpecName: "node-log") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891076 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891080 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891108 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891133 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891150 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891182 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891201 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891221 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891257 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891305 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr7pd\" (UniqueName: \"kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891344 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides\") pod \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\" (UID: \"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c\") " Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891357 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: 
"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891398 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891464 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-node-log\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891526 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-script-lib\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891549 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-ovn\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891591 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-var-lib-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891612 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-log-socket\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891628 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-env-overrides\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891642 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891675 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891679 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17caf24e-16d1-4404-9ab5-15f698e50c95-ovn-node-metrics-cert\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-systemd\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891745 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-netd\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891770 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-slash\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891746 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket" (OuterVolumeSpecName: "log-socket") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: 
"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891811 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891828 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-etc-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891851 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-config\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891891 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-bin\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891917 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-kubelet\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891943 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.891978 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-systemd-units\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892001 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-netns\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892015 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892024 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5752p\" (UniqueName: \"kubernetes.io/projected/17caf24e-16d1-4404-9ab5-15f698e50c95-kube-api-access-5752p\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892087 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892097 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892083 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892111 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892127 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892148 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892152 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892179 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892274 4990 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892289 4990 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892301 4990 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892311 4990 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892321 4990 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892331 4990 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892341 4990 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892351 4990 
reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892362 4990 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.892372 4990 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.898879 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.900732 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd" (OuterVolumeSpecName: "kube-api-access-dr7pd") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "kube-api-access-dr7pd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.906131 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" (UID: "7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.956769 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovnkube-controller/3.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.959907 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovn-acl-logging/0.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.960481 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7rqmg_7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/ovn-controller/0.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961362 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961399 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961410 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 
09:54:26.961420 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961433 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961444 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" exitCode=0 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961479 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" exitCode=143 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961494 4990 generic.go:334] "Generic (PLEG): container finished" podID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" exitCode=143 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961529 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961543 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961555 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961566 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961582 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961594 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961609 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961615 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961621 4990 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961630 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961635 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961640 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961647 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961653 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961660 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961673 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} Oct 03 09:54:26 crc 
kubenswrapper[4990]: I1003 09:54:26.961681 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961689 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961695 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961704 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961713 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961722 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961729 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961736 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:54:26 crc 
kubenswrapper[4990]: I1003 09:54:26.961708 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961742 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962058 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962109 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962128 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962135 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962141 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.961736 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962148 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962246 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962260 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962267 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962273 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962280 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962292 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7rqmg" event={"ID":"7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c","Type":"ContainerDied","Data":"25da2c61b63a282d8c2aca37731a7c787e40adb4e1b679f798cce85d5cfb39d3"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962311 4990 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962320 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962326 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962332 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962338 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962345 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962350 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962358 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962366 4990 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.962372 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.964239 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/1.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.964977 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/0.log" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.965043 4990 generic.go:334] "Generic (PLEG): container finished" podID="31671a76-378e-4899-89ae-d27e608c3cda" containerID="906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918" exitCode=2 Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.965065 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerDied","Data":"906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.965077 4990 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132"} Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.965753 4990 scope.go:117] "RemoveContainer" containerID="906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918" Oct 03 09:54:26 crc kubenswrapper[4990]: E1003 09:54:26.966246 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=kube-multus pod=multus-bspdz_openshift-multus(31671a76-378e-4899-89ae-d27e608c3cda)\"" pod="openshift-multus/multus-bspdz" podUID="31671a76-378e-4899-89ae-d27e608c3cda" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.987864 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996588 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-systemd-units\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996649 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-netns\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996694 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5752p\" (UniqueName: \"kubernetes.io/projected/17caf24e-16d1-4404-9ab5-15f698e50c95-kube-api-access-5752p\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996715 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-systemd-units\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996733 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996791 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996796 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-node-log\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996824 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-node-log\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996838 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-script-lib\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996867 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-ovn\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996909 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-var-lib-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996938 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-log-socket\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.996959 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-env-overrides\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997006 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17caf24e-16d1-4404-9ab5-15f698e50c95-ovn-node-metrics-cert\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997027 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-systemd\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997062 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-netd\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997090 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-slash\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997118 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-etc-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997140 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-config\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997165 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997190 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-bin\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997197 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-netns\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997370 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-slash\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997414 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-systemd\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997467 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-netd\") pod \"ovnkube-node-qbw4z\" (UID: 
\"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997848 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-etc-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.997975 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.998043 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-log-socket\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.998112 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-run-ovn\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.998170 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-var-lib-openvswitch\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc 
kubenswrapper[4990]: I1003 09:54:26.998256 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-script-lib\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.998296 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-ovnkube-config\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.998384 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-cni-bin\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/17caf24e-16d1-4404-9ab5-15f698e50c95-env-overrides\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-kubelet\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999317 4990 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999404 4990 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999427 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999438 4990 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999457 4990 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999482 4990 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999501 4990 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999543 4990 reconciler_common.go:293] "Volume 
detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999557 4990 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999570 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr7pd\" (UniqueName: \"kubernetes.io/projected/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-kube-api-access-dr7pd\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999582 4990 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999587 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-kubelet\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:26 crc kubenswrapper[4990]: I1003 09:54:26.999646 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17caf24e-16d1-4404-9ab5-15f698e50c95-host-run-ovn-kubernetes\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.009031 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/17caf24e-16d1-4404-9ab5-15f698e50c95-ovn-node-metrics-cert\") pod 
\"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.017745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5752p\" (UniqueName: \"kubernetes.io/projected/17caf24e-16d1-4404-9ab5-15f698e50c95-kube-api-access-5752p\") pod \"ovnkube-node-qbw4z\" (UID: \"17caf24e-16d1-4404-9ab5-15f698e50c95\") " pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.022052 4990 scope.go:117] "RemoveContainer" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.024934 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7rqmg"] Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.029619 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7rqmg"] Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.039915 4990 scope.go:117] "RemoveContainer" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.055306 4990 scope.go:117] "RemoveContainer" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.071717 4990 scope.go:117] "RemoveContainer" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.084742 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.085929 4990 scope.go:117] "RemoveContainer" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.108499 4990 scope.go:117] "RemoveContainer" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.132608 4990 scope.go:117] "RemoveContainer" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.155666 4990 scope.go:117] "RemoveContainer" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.176660 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.178240 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 not found: ID does not exist" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.178326 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} err="failed to get container status \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 
not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.178374 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.179020 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": container with ID starting with 07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743 not found: ID does not exist" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.179104 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} err="failed to get container status \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": rpc error: code = NotFound desc = could not find container \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": container with ID starting with 07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.179148 4990 scope.go:117] "RemoveContainer" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.179552 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": container with ID starting with 6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600 not found: ID does not exist" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.179592 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} err="failed to get container status \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": rpc error: code = NotFound desc = could not find container \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": container with ID starting with 6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.179620 4990 scope.go:117] "RemoveContainer" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.180069 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": container with ID starting with c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf not found: ID does not exist" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.180106 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} err="failed to get container status \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": rpc error: code = NotFound desc = could not find container \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": container with ID starting with c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.180129 4990 scope.go:117] "RemoveContainer" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 
09:54:27.180454 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": container with ID starting with 5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5 not found: ID does not exist" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.180480 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} err="failed to get container status \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": rpc error: code = NotFound desc = could not find container \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": container with ID starting with 5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.180496 4990 scope.go:117] "RemoveContainer" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.180948 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": container with ID starting with a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf not found: ID does not exist" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.180994 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} err="failed to get container status \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": rpc 
error: code = NotFound desc = could not find container \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": container with ID starting with a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.181019 4990 scope.go:117] "RemoveContainer" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.181325 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": container with ID starting with 04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2 not found: ID does not exist" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.181373 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} err="failed to get container status \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": rpc error: code = NotFound desc = could not find container \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": container with ID starting with 04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.181403 4990 scope.go:117] "RemoveContainer" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.181855 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": container with ID starting with 
30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d not found: ID does not exist" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.181975 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} err="failed to get container status \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": rpc error: code = NotFound desc = could not find container \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": container with ID starting with 30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.182008 4990 scope.go:117] "RemoveContainer" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.182289 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": container with ID starting with a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474 not found: ID does not exist" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.182321 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} err="failed to get container status \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": rpc error: code = NotFound desc = could not find container \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": container with ID starting with a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474 not found: ID does not 
exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.182340 4990 scope.go:117] "RemoveContainer" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: E1003 09:54:27.182727 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": container with ID starting with 0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f not found: ID does not exist" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.182772 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} err="failed to get container status \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": rpc error: code = NotFound desc = could not find container \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": container with ID starting with 0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.182801 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.183279 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} err="failed to get container status \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 not found: ID 
does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.183303 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.183654 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} err="failed to get container status \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": rpc error: code = NotFound desc = could not find container \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": container with ID starting with 07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.183692 4990 scope.go:117] "RemoveContainer" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184043 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} err="failed to get container status \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": rpc error: code = NotFound desc = could not find container \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": container with ID starting with 6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184071 4990 scope.go:117] "RemoveContainer" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184423 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} err="failed to get container 
status \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": rpc error: code = NotFound desc = could not find container \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": container with ID starting with c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184446 4990 scope.go:117] "RemoveContainer" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184845 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} err="failed to get container status \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": rpc error: code = NotFound desc = could not find container \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": container with ID starting with 5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.184871 4990 scope.go:117] "RemoveContainer" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.185178 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} err="failed to get container status \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": rpc error: code = NotFound desc = could not find container \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": container with ID starting with a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.185196 4990 scope.go:117] "RemoveContainer" 
containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.185626 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} err="failed to get container status \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": rpc error: code = NotFound desc = could not find container \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": container with ID starting with 04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.185659 4990 scope.go:117] "RemoveContainer" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186007 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} err="failed to get container status \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": rpc error: code = NotFound desc = could not find container \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": container with ID starting with 30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186051 4990 scope.go:117] "RemoveContainer" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186402 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} err="failed to get container status \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": rpc error: code = NotFound desc = could 
not find container \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": container with ID starting with a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186427 4990 scope.go:117] "RemoveContainer" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186912 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} err="failed to get container status \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": rpc error: code = NotFound desc = could not find container \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": container with ID starting with 0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.186938 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.187270 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} err="failed to get container status \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.187290 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 
09:54:27.187628 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} err="failed to get container status \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": rpc error: code = NotFound desc = could not find container \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": container with ID starting with 07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.187668 4990 scope.go:117] "RemoveContainer" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188027 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} err="failed to get container status \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": rpc error: code = NotFound desc = could not find container \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": container with ID starting with 6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188047 4990 scope.go:117] "RemoveContainer" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188317 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} err="failed to get container status \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": rpc error: code = NotFound desc = could not find container \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": container with ID starting with 
c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188337 4990 scope.go:117] "RemoveContainer" containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188621 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} err="failed to get container status \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": rpc error: code = NotFound desc = could not find container \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": container with ID starting with 5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188641 4990 scope.go:117] "RemoveContainer" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188927 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} err="failed to get container status \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": rpc error: code = NotFound desc = could not find container \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": container with ID starting with a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.188971 4990 scope.go:117] "RemoveContainer" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189276 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} err="failed to get container status \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": rpc error: code = NotFound desc = could not find container \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": container with ID starting with 04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189305 4990 scope.go:117] "RemoveContainer" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189630 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} err="failed to get container status \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": rpc error: code = NotFound desc = could not find container \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": container with ID starting with 30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189653 4990 scope.go:117] "RemoveContainer" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189966 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} err="failed to get container status \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": rpc error: code = NotFound desc = could not find container \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": container with ID starting with a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474 not found: ID does not 
exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.189985 4990 scope.go:117] "RemoveContainer" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.190330 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} err="failed to get container status \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": rpc error: code = NotFound desc = could not find container \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": container with ID starting with 0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.190349 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.190661 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} err="failed to get container status \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.190679 4990 scope.go:117] "RemoveContainer" containerID="07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191004 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743"} err="failed to get container status 
\"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": rpc error: code = NotFound desc = could not find container \"07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743\": container with ID starting with 07345875dc68d834a0a986e36d0ae9db9233d9ec761b3eb3b306cecfae86b743 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191043 4990 scope.go:117] "RemoveContainer" containerID="6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191379 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600"} err="failed to get container status \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": rpc error: code = NotFound desc = could not find container \"6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600\": container with ID starting with 6952869bd431b12228e2c15bd9150f22f859ef1a99746040ad7fee419273e600 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191405 4990 scope.go:117] "RemoveContainer" containerID="c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191735 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf"} err="failed to get container status \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": rpc error: code = NotFound desc = could not find container \"c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf\": container with ID starting with c430472ba638ff520984f4f78b1d1a7736e19e28b5a1ed357735d40daee9bcaf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.191755 4990 scope.go:117] "RemoveContainer" 
containerID="5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192010 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5"} err="failed to get container status \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": rpc error: code = NotFound desc = could not find container \"5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5\": container with ID starting with 5e07da6ff167aa7782aeb835c626aceb06f2401a41b9e829ac349fccdce173c5 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192032 4990 scope.go:117] "RemoveContainer" containerID="a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192312 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf"} err="failed to get container status \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": rpc error: code = NotFound desc = could not find container \"a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf\": container with ID starting with a42f64a8a5eae3cfdc8e113ba01753b23f98ad8cd86ed49919b379db89a7a4cf not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192328 4990 scope.go:117] "RemoveContainer" containerID="04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192570 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2"} err="failed to get container status \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": rpc error: code = NotFound desc = could 
not find container \"04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2\": container with ID starting with 04a33214f03bca896690fbeb279f72d3800cb818b333dde964ace87bd82797f2 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192591 4990 scope.go:117] "RemoveContainer" containerID="30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192914 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d"} err="failed to get container status \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": rpc error: code = NotFound desc = could not find container \"30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d\": container with ID starting with 30b9f599b1b38d6f8216ea3852c5fb8843510a880e3a8bede6b64494e3c7c67d not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.192940 4990 scope.go:117] "RemoveContainer" containerID="a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.193233 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474"} err="failed to get container status \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": rpc error: code = NotFound desc = could not find container \"a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474\": container with ID starting with a16291aa1c3acfc7fe539f614471c592cf9e1d0faf40adb382373231b636f474 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.193252 4990 scope.go:117] "RemoveContainer" containerID="0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 
09:54:27.193594 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f"} err="failed to get container status \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": rpc error: code = NotFound desc = could not find container \"0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f\": container with ID starting with 0a9c85622dee61a69fa43a92c18a8e9104fe2167856486e87ee1c47e877c956f not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.193633 4990 scope.go:117] "RemoveContainer" containerID="be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.193992 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576"} err="failed to get container status \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": rpc error: code = NotFound desc = could not find container \"be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576\": container with ID starting with be98f89155cefc1a1737694ad0b56ec7a9251185f6f5182ad9dc8d5496677576 not found: ID does not exist" Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.976343 4990 generic.go:334] "Generic (PLEG): container finished" podID="17caf24e-16d1-4404-9ab5-15f698e50c95" containerID="c06b22de268a5050dcf4e788e9b646115e613eb4c6aeea9561fe5ce1a08402c0" exitCode=0 Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.976474 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerDied","Data":"c06b22de268a5050dcf4e788e9b646115e613eb4c6aeea9561fe5ce1a08402c0"} Oct 03 09:54:27 crc kubenswrapper[4990]: I1003 09:54:27.976863 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"b0762720117373f75f9ad83e310e00a987ea6f6d15cc8899ed59e200b86b0717"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.882632 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c" path="/var/lib/kubelet/pods/7ae9c9fa-1ce9-42dc-aa7f-e6f80d17a45c/volumes" Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986388 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"6de402c9af1dc1971ce80198f926a2fb0e92eeef7acc3ef0be248c683052b340"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986443 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"eaf9e75078ee8585feb9e5eda334c426af467638abda290c83455f0da5f07500"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986455 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"3814540f7795329de86fee01a649bbafe717667cfc1fdd00c4cecf2b10834143"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986465 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"02b19da58856f3499c67340d4c318d50705bce5ca3f344fbe4f6894fa55edf16"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986486 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" 
event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"52ee9c8b5f129cee910d1c585c721b82a29b395976c415ac31be0d0516a4e5be"} Oct 03 09:54:28 crc kubenswrapper[4990]: I1003 09:54:28.986495 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"51a7562501cb87f686c7c312f64c07ecdfa964dcf07c95aa334e5dab0c8a864a"} Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.168944 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-hbzxs"] Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.170770 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.173134 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.173333 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.173662 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.173963 4990 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pfc72" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.175881 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.175938 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.175980 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rggzg\" (UniqueName: \"kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.277353 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.277594 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.277650 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rggzg\" (UniqueName: \"kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.278210 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.278555 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.310849 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rggzg\" (UniqueName: \"kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg\") pod \"crc-storage-crc-hbzxs\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: I1003 09:54:31.489253 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: E1003 09:54:31.518839 4990 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(25f8a075c2d4e386d4a3072edc88214919319c8a4ef815bfce165d6e12059925): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 09:54:31 crc kubenswrapper[4990]: E1003 09:54:31.518942 4990 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(25f8a075c2d4e386d4a3072edc88214919319c8a4ef815bfce165d6e12059925): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: E1003 09:54:31.518966 4990 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(25f8a075c2d4e386d4a3072edc88214919319c8a4ef815bfce165d6e12059925): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:31 crc kubenswrapper[4990]: E1003 09:54:31.519028 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-hbzxs_crc-storage(7e729a46-4a98-46a6-bc49-d6f73014b3b5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-hbzxs_crc-storage(7e729a46-4a98-46a6-bc49-d6f73014b3b5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(25f8a075c2d4e386d4a3072edc88214919319c8a4ef815bfce165d6e12059925): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-hbzxs" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" Oct 03 09:54:32 crc kubenswrapper[4990]: I1003 09:54:32.009705 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"c00f214efd2c262bb4245294774d8a62f6bdd39d4bed82f63e1e257fd9eb80fd"} Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.029088 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" event={"ID":"17caf24e-16d1-4404-9ab5-15f698e50c95","Type":"ContainerStarted","Data":"e044b715bd1f47828d9881fd6996f49c787cce6f407f8261fd972012e27a0ee2"} Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.029683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.030036 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.072363 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.090666 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" podStartSLOduration=8.090641175 podStartE2EDuration="8.090641175s" podCreationTimestamp="2025-10-03 09:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:54:34.085789829 +0000 UTC m=+655.882421766" watchObservedRunningTime="2025-10-03 09:54:34.090641175 +0000 UTC m=+655.887273052" Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.119224 4990 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["crc-storage/crc-storage-crc-hbzxs"] Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.119353 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:34 crc kubenswrapper[4990]: I1003 09:54:34.119781 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:34 crc kubenswrapper[4990]: E1003 09:54:34.143082 4990 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(739902c8cfd2862e52bd318624f3946ae3fcba42af380efc4ce994946fdeb890): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 09:54:34 crc kubenswrapper[4990]: E1003 09:54:34.143264 4990 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(739902c8cfd2862e52bd318624f3946ae3fcba42af380efc4ce994946fdeb890): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:34 crc kubenswrapper[4990]: E1003 09:54:34.143336 4990 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(739902c8cfd2862e52bd318624f3946ae3fcba42af380efc4ce994946fdeb890): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:34 crc kubenswrapper[4990]: E1003 09:54:34.143448 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-hbzxs_crc-storage(7e729a46-4a98-46a6-bc49-d6f73014b3b5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-hbzxs_crc-storage(7e729a46-4a98-46a6-bc49-d6f73014b3b5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-hbzxs_crc-storage_7e729a46-4a98-46a6-bc49-d6f73014b3b5_0(739902c8cfd2862e52bd318624f3946ae3fcba42af380efc4ce994946fdeb890): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-hbzxs" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" Oct 03 09:54:35 crc kubenswrapper[4990]: I1003 09:54:35.036908 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:35 crc kubenswrapper[4990]: I1003 09:54:35.073923 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:54:38 crc kubenswrapper[4990]: I1003 09:54:38.875915 4990 scope.go:117] "RemoveContainer" containerID="906d1efc6705f32fa0c9efb98709a0ba25ff82d7b550693372b9f4ee90278918" Oct 03 09:54:39 crc kubenswrapper[4990]: I1003 09:54:39.065736 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/1.log" Oct 03 09:54:39 crc kubenswrapper[4990]: I1003 09:54:39.066902 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/0.log" Oct 03 09:54:39 crc kubenswrapper[4990]: I1003 09:54:39.214822 4990 scope.go:117] "RemoveContainer" containerID="8980e09d4f496c404ac284f937ab0d8fb2818178388d7bd8bb97dffa10ae5132" Oct 03 09:54:40 crc 
kubenswrapper[4990]: I1003 09:54:40.077401 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bspdz_31671a76-378e-4899-89ae-d27e608c3cda/kube-multus/1.log" Oct 03 09:54:40 crc kubenswrapper[4990]: I1003 09:54:40.077551 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bspdz" event={"ID":"31671a76-378e-4899-89ae-d27e608c3cda","Type":"ContainerStarted","Data":"3bcf4a1ee6777a8bbb0583318051a82d06c31240b0ee4f9a2fb03e79c43fea74"} Oct 03 09:54:49 crc kubenswrapper[4990]: I1003 09:54:49.871305 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:49 crc kubenswrapper[4990]: I1003 09:54:49.872110 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:50 crc kubenswrapper[4990]: I1003 09:54:50.111896 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-hbzxs"] Oct 03 09:54:50 crc kubenswrapper[4990]: I1003 09:54:50.123587 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:54:50 crc kubenswrapper[4990]: I1003 09:54:50.137577 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hbzxs" event={"ID":"7e729a46-4a98-46a6-bc49-d6f73014b3b5","Type":"ContainerStarted","Data":"9e2d56c64c90cb96c8fa269dc3bb684d982552315e023ddd709ad47218165782"} Oct 03 09:54:52 crc kubenswrapper[4990]: I1003 09:54:52.150837 4990 generic.go:334] "Generic (PLEG): container finished" podID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" containerID="0526b31e28e3b59b231194eb871d1d2d5943d70d62c4984c4c718c80c4aa3110" exitCode=0 Oct 03 09:54:52 crc kubenswrapper[4990]: I1003 09:54:52.150857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hbzxs" 
event={"ID":"7e729a46-4a98-46a6-bc49-d6f73014b3b5","Type":"ContainerDied","Data":"0526b31e28e3b59b231194eb871d1d2d5943d70d62c4984c4c718c80c4aa3110"} Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.416312 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.532094 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rggzg\" (UniqueName: \"kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg\") pod \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.532271 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage\") pod \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.532387 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt\") pod \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\" (UID: \"7e729a46-4a98-46a6-bc49-d6f73014b3b5\") " Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.532595 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7e729a46-4a98-46a6-bc49-d6f73014b3b5" (UID: "7e729a46-4a98-46a6-bc49-d6f73014b3b5"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.533105 4990 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7e729a46-4a98-46a6-bc49-d6f73014b3b5-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.547913 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg" (OuterVolumeSpecName: "kube-api-access-rggzg") pod "7e729a46-4a98-46a6-bc49-d6f73014b3b5" (UID: "7e729a46-4a98-46a6-bc49-d6f73014b3b5"). InnerVolumeSpecName "kube-api-access-rggzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.548700 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7e729a46-4a98-46a6-bc49-d6f73014b3b5" (UID: "7e729a46-4a98-46a6-bc49-d6f73014b3b5"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.634691 4990 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7e729a46-4a98-46a6-bc49-d6f73014b3b5-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:53 crc kubenswrapper[4990]: I1003 09:54:53.634738 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rggzg\" (UniqueName: \"kubernetes.io/projected/7e729a46-4a98-46a6-bc49-d6f73014b3b5-kube-api-access-rggzg\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:54 crc kubenswrapper[4990]: I1003 09:54:54.163542 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-hbzxs" event={"ID":"7e729a46-4a98-46a6-bc49-d6f73014b3b5","Type":"ContainerDied","Data":"9e2d56c64c90cb96c8fa269dc3bb684d982552315e023ddd709ad47218165782"} Oct 03 09:54:54 crc kubenswrapper[4990]: I1003 09:54:54.163871 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e2d56c64c90cb96c8fa269dc3bb684d982552315e023ddd709ad47218165782" Oct 03 09:54:54 crc kubenswrapper[4990]: I1003 09:54:54.163609 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-hbzxs" Oct 03 09:54:57 crc kubenswrapper[4990]: I1003 09:54:57.111787 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qbw4z" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.455316 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv"] Oct 03 09:55:00 crc kubenswrapper[4990]: E1003 09:55:00.456028 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" containerName="storage" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.456049 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" containerName="storage" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.456253 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" containerName="storage" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.457296 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.465826 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.473017 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv"] Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.537193 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.537310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmmv7\" (UniqueName: \"kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.537348 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: 
I1003 09:55:00.638880 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.638970 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.639089 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmmv7\" (UniqueName: \"kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.639633 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.639830 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.674206 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmmv7\" (UniqueName: \"kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:00 crc kubenswrapper[4990]: I1003 09:55:00.781440 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:01 crc kubenswrapper[4990]: I1003 09:55:01.029833 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv"] Oct 03 09:55:01 crc kubenswrapper[4990]: I1003 09:55:01.208765 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerStarted","Data":"9d2484639e38238fe3141037e7f823f890306c52f5a7a34680d01020e9506db1"} Oct 03 09:55:01 crc kubenswrapper[4990]: I1003 09:55:01.209284 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerStarted","Data":"c6acae8b20b2eb2b4624e29156503f1574c7ffe76363d88d829d6c41ca312ff6"} Oct 03 09:55:02 crc kubenswrapper[4990]: I1003 09:55:02.217152 4990 
generic.go:334] "Generic (PLEG): container finished" podID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerID="9d2484639e38238fe3141037e7f823f890306c52f5a7a34680d01020e9506db1" exitCode=0 Oct 03 09:55:02 crc kubenswrapper[4990]: I1003 09:55:02.217242 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerDied","Data":"9d2484639e38238fe3141037e7f823f890306c52f5a7a34680d01020e9506db1"} Oct 03 09:55:04 crc kubenswrapper[4990]: I1003 09:55:04.234171 4990 generic.go:334] "Generic (PLEG): container finished" podID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerID="4dfe4299c43206c1c9ac9ce8848f58d1b2720f3dc827759e089eb158861c29c8" exitCode=0 Oct 03 09:55:04 crc kubenswrapper[4990]: I1003 09:55:04.234470 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerDied","Data":"4dfe4299c43206c1c9ac9ce8848f58d1b2720f3dc827759e089eb158861c29c8"} Oct 03 09:55:05 crc kubenswrapper[4990]: I1003 09:55:05.242392 4990 generic.go:334] "Generic (PLEG): container finished" podID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerID="cc4fb029fbcc4fe5d541d9aa886b00afb04351ca5dcf90d8848077afb3243549" exitCode=0 Oct 03 09:55:05 crc kubenswrapper[4990]: I1003 09:55:05.242529 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerDied","Data":"cc4fb029fbcc4fe5d541d9aa886b00afb04351ca5dcf90d8848077afb3243549"} Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.469826 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.633801 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle\") pod \"f30d9093-2fce-4e5d-a711-4c746b501b60\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.633887 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util\") pod \"f30d9093-2fce-4e5d-a711-4c746b501b60\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.633945 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmmv7\" (UniqueName: \"kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7\") pod \"f30d9093-2fce-4e5d-a711-4c746b501b60\" (UID: \"f30d9093-2fce-4e5d-a711-4c746b501b60\") " Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.636680 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle" (OuterVolumeSpecName: "bundle") pod "f30d9093-2fce-4e5d-a711-4c746b501b60" (UID: "f30d9093-2fce-4e5d-a711-4c746b501b60"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.641703 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7" (OuterVolumeSpecName: "kube-api-access-kmmv7") pod "f30d9093-2fce-4e5d-a711-4c746b501b60" (UID: "f30d9093-2fce-4e5d-a711-4c746b501b60"). InnerVolumeSpecName "kube-api-access-kmmv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.726623 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util" (OuterVolumeSpecName: "util") pod "f30d9093-2fce-4e5d-a711-4c746b501b60" (UID: "f30d9093-2fce-4e5d-a711-4c746b501b60"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.735805 4990 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.735851 4990 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f30d9093-2fce-4e5d-a711-4c746b501b60-util\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:06 crc kubenswrapper[4990]: I1003 09:55:06.735866 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmmv7\" (UniqueName: \"kubernetes.io/projected/f30d9093-2fce-4e5d-a711-4c746b501b60-kube-api-access-kmmv7\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:07 crc kubenswrapper[4990]: I1003 09:55:07.261942 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" event={"ID":"f30d9093-2fce-4e5d-a711-4c746b501b60","Type":"ContainerDied","Data":"c6acae8b20b2eb2b4624e29156503f1574c7ffe76363d88d829d6c41ca312ff6"} Oct 03 09:55:07 crc kubenswrapper[4990]: I1003 09:55:07.262442 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6acae8b20b2eb2b4624e29156503f1574c7ffe76363d88d829d6c41ca312ff6" Oct 03 09:55:07 crc kubenswrapper[4990]: I1003 09:55:07.262402 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.068345 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r"] Oct 03 09:55:12 crc kubenswrapper[4990]: E1003 09:55:12.069188 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="extract" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.069206 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="extract" Oct 03 09:55:12 crc kubenswrapper[4990]: E1003 09:55:12.069235 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="pull" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.069244 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="pull" Oct 03 09:55:12 crc kubenswrapper[4990]: E1003 09:55:12.069269 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="util" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.069278 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="util" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.069423 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30d9093-2fce-4e5d-a711-4c746b501b60" containerName="extract" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.070083 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.072349 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.072596 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-65m6s" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.072706 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.082860 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r"] Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.132452 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4kz\" (UniqueName: \"kubernetes.io/projected/b89c7007-cfc4-42f5-8b2d-024b7d364d54-kube-api-access-dx4kz\") pod \"nmstate-operator-858ddd8f98-4zf9r\" (UID: \"b89c7007-cfc4-42f5-8b2d-024b7d364d54\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.234616 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4kz\" (UniqueName: \"kubernetes.io/projected/b89c7007-cfc4-42f5-8b2d-024b7d364d54-kube-api-access-dx4kz\") pod \"nmstate-operator-858ddd8f98-4zf9r\" (UID: \"b89c7007-cfc4-42f5-8b2d-024b7d364d54\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.253569 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4kz\" (UniqueName: \"kubernetes.io/projected/b89c7007-cfc4-42f5-8b2d-024b7d364d54-kube-api-access-dx4kz\") pod \"nmstate-operator-858ddd8f98-4zf9r\" (UID: 
\"b89c7007-cfc4-42f5-8b2d-024b7d364d54\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.385976 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" Oct 03 09:55:12 crc kubenswrapper[4990]: I1003 09:55:12.836172 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r"] Oct 03 09:55:13 crc kubenswrapper[4990]: I1003 09:55:13.300042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" event={"ID":"b89c7007-cfc4-42f5-8b2d-024b7d364d54","Type":"ContainerStarted","Data":"753e24b2968a8f235214a462da7dab8f9510f2039f79d5d24b5ebb843fe26bdd"} Oct 03 09:55:17 crc kubenswrapper[4990]: I1003 09:55:17.328719 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" event={"ID":"b89c7007-cfc4-42f5-8b2d-024b7d364d54","Type":"ContainerStarted","Data":"e9f7324dbe44fb54dd9e72b6f645c51e7d9dea19160aa836e3b64651047827f8"} Oct 03 09:55:17 crc kubenswrapper[4990]: I1003 09:55:17.366768 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-4zf9r" podStartSLOduration=1.547975527 podStartE2EDuration="5.366716497s" podCreationTimestamp="2025-10-03 09:55:12 +0000 UTC" firstStartedPulling="2025-10-03 09:55:12.844502241 +0000 UTC m=+694.641134098" lastFinishedPulling="2025-10-03 09:55:16.663243201 +0000 UTC m=+698.459875068" observedRunningTime="2025-10-03 09:55:17.355349081 +0000 UTC m=+699.151981018" watchObservedRunningTime="2025-10-03 09:55:17.366716497 +0000 UTC m=+699.163348384" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.931760 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp"] Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 
09:55:20.933542 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.938243 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-49z2d" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.942372 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb"] Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.943769 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.945645 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.957926 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp"] Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.967858 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ll6ss"] Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.968776 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.975846 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr29g\" (UniqueName: \"kubernetes.io/projected/15431b47-cf1b-425a-807a-7239e2947fff-kube-api-access-wr29g\") pod \"nmstate-metrics-fdff9cb8d-pn5fp\" (UID: \"15431b47-cf1b-425a-807a-7239e2947fff\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.975960 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6b5l\" (UniqueName: \"kubernetes.io/projected/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-kube-api-access-m6b5l\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.976020 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:20 crc kubenswrapper[4990]: I1003 09:55:20.986489 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.077656 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr29g\" (UniqueName: \"kubernetes.io/projected/15431b47-cf1b-425a-807a-7239e2947fff-kube-api-access-wr29g\") pod \"nmstate-metrics-fdff9cb8d-pn5fp\" (UID: \"15431b47-cf1b-425a-807a-7239e2947fff\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" Oct 03 09:55:21 crc kubenswrapper[4990]: 
I1003 09:55:21.078137 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28sr\" (UniqueName: \"kubernetes.io/projected/f9c86541-a05e-4cee-9a09-2466ca8d588e-kube-api-access-s28sr\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.078165 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-nmstate-lock\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.078207 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-ovs-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.078281 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6b5l\" (UniqueName: \"kubernetes.io/projected/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-kube-api-access-m6b5l\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.078443 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-dbus-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 
crc kubenswrapper[4990]: I1003 09:55:21.078474 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: E1003 09:55:21.078607 4990 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 03 09:55:21 crc kubenswrapper[4990]: E1003 09:55:21.078666 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair podName:277e4ee9-1a86-464c-b4b8-9dcb1a17e1df nodeName:}" failed. No retries permitted until 2025-10-03 09:55:21.578644777 +0000 UTC m=+703.375276634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair") pod "nmstate-webhook-6cdbc54649-l94bb" (UID: "277e4ee9-1a86-464c-b4b8-9dcb1a17e1df") : secret "openshift-nmstate-webhook" not found Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.088652 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.089739 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.092184 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.092184 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qkt5t" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.095760 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.107336 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.120047 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr29g\" (UniqueName: \"kubernetes.io/projected/15431b47-cf1b-425a-807a-7239e2947fff-kube-api-access-wr29g\") pod \"nmstate-metrics-fdff9cb8d-pn5fp\" (UID: \"15431b47-cf1b-425a-807a-7239e2947fff\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.122447 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6b5l\" (UniqueName: \"kubernetes.io/projected/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-kube-api-access-m6b5l\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.179828 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.179927 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-dbus-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180007 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqgw\" (UniqueName: \"kubernetes.io/projected/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-kube-api-access-wrqgw\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180059 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180098 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28sr\" (UniqueName: \"kubernetes.io/projected/f9c86541-a05e-4cee-9a09-2466ca8d588e-kube-api-access-s28sr\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180128 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-nmstate-lock\") pod 
\"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180190 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-ovs-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180284 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-ovs-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180375 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-dbus-socket\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.180429 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9c86541-a05e-4cee-9a09-2466ca8d588e-nmstate-lock\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.197324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28sr\" (UniqueName: \"kubernetes.io/projected/f9c86541-a05e-4cee-9a09-2466ca8d588e-kube-api-access-s28sr\") pod \"nmstate-handler-ll6ss\" (UID: \"f9c86541-a05e-4cee-9a09-2466ca8d588e\") " pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 
09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.262310 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.281809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqgw\" (UniqueName: \"kubernetes.io/projected/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-kube-api-access-wrqgw\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.281880 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.281931 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.283170 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.301048 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.301549 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.310254 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-549c97b65-tgdmn"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.310751 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqgw\" (UniqueName: \"kubernetes.io/projected/737d1d1e-6e9c-4d10-b3b9-101bd4b3e440-kube-api-access-wrqgw\") pod \"nmstate-console-plugin-6b874cbd85-fb92b\" (UID: \"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.311407 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.316552 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-549c97b65-tgdmn"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.366731 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ll6ss" event={"ID":"f9c86541-a05e-4cee-9a09-2466ca8d588e","Type":"ContainerStarted","Data":"d7c64e77ed563f537d06354c988f052fc9af9d65dafad27ceb3e3b219967bb2d"} Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383380 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-service-ca\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383613 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-oauth-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383652 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-oauth-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383704 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383746 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383788 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-trusted-ca-bundle\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.383816 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gw26\" (UniqueName: \"kubernetes.io/projected/f3eb5339-6bf8-44fb-bd87-96e610c60d45-kube-api-access-9gw26\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.409730 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485344 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-service-ca\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485387 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-oauth-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485419 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-oauth-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485443 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485486 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " 
pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485535 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-trusted-ca-bundle\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.485552 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gw26\" (UniqueName: \"kubernetes.io/projected/f3eb5339-6bf8-44fb-bd87-96e610c60d45-kube-api-access-9gw26\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.486392 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-service-ca\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.486721 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-oauth-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.487280 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc 
kubenswrapper[4990]: I1003 09:55:21.489036 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3eb5339-6bf8-44fb-bd87-96e610c60d45-trusted-ca-bundle\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.490462 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-serving-cert\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.490557 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3eb5339-6bf8-44fb-bd87-96e610c60d45-console-oauth-config\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.507751 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gw26\" (UniqueName: \"kubernetes.io/projected/f3eb5339-6bf8-44fb-bd87-96e610c60d45-kube-api-access-9gw26\") pod \"console-549c97b65-tgdmn\" (UID: \"f3eb5339-6bf8-44fb-bd87-96e610c60d45\") " pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.509245 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp"] Oct 03 09:55:21 crc kubenswrapper[4990]: W1003 09:55:21.521089 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15431b47_cf1b_425a_807a_7239e2947fff.slice/crio-13c230bc443f84d242ac0d6630e960d57a6f241fed22e47d50bca4e299be8c57 WatchSource:0}: Error finding container 13c230bc443f84d242ac0d6630e960d57a6f241fed22e47d50bca4e299be8c57: Status 404 returned error can't find the container with id 13c230bc443f84d242ac0d6630e960d57a6f241fed22e47d50bca4e299be8c57 Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.586409 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.591430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/277e4ee9-1a86-464c-b4b8-9dcb1a17e1df-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l94bb\" (UID: \"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.641216 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b"] Oct 03 09:55:21 crc kubenswrapper[4990]: W1003 09:55:21.648124 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737d1d1e_6e9c_4d10_b3b9_101bd4b3e440.slice/crio-4bb2ce1b3e5976abb4c5d1d49abe8e1437c1bc66684d33046b9d79f2e765b8bf WatchSource:0}: Error finding container 4bb2ce1b3e5976abb4c5d1d49abe8e1437c1bc66684d33046b9d79f2e765b8bf: Status 404 returned error can't find the container with id 4bb2ce1b3e5976abb4c5d1d49abe8e1437c1bc66684d33046b9d79f2e765b8bf Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.678944 4990 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.875109 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-549c97b65-tgdmn"] Oct 03 09:55:21 crc kubenswrapper[4990]: I1003 09:55:21.875539 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:21 crc kubenswrapper[4990]: W1003 09:55:21.896009 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3eb5339_6bf8_44fb_bd87_96e610c60d45.slice/crio-55613ea04347767110bab40fe700a9351f187f173b873e2fa01e61a15fc8527e WatchSource:0}: Error finding container 55613ea04347767110bab40fe700a9351f187f173b873e2fa01e61a15fc8527e: Status 404 returned error can't find the container with id 55613ea04347767110bab40fe700a9351f187f173b873e2fa01e61a15fc8527e Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.098093 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb"] Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.376176 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-tgdmn" event={"ID":"f3eb5339-6bf8-44fb-bd87-96e610c60d45","Type":"ContainerStarted","Data":"a13e9a689fdbdd0d0a3132316924fef7d64b9bb523789c3657a61640a3b2e634"} Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.376256 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-549c97b65-tgdmn" event={"ID":"f3eb5339-6bf8-44fb-bd87-96e610c60d45","Type":"ContainerStarted","Data":"55613ea04347767110bab40fe700a9351f187f173b873e2fa01e61a15fc8527e"} Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.378477 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" event={"ID":"15431b47-cf1b-425a-807a-7239e2947fff","Type":"ContainerStarted","Data":"13c230bc443f84d242ac0d6630e960d57a6f241fed22e47d50bca4e299be8c57"} Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.379619 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" event={"ID":"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440","Type":"ContainerStarted","Data":"4bb2ce1b3e5976abb4c5d1d49abe8e1437c1bc66684d33046b9d79f2e765b8bf"} Oct 03 09:55:22 crc kubenswrapper[4990]: I1003 09:55:22.381160 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" event={"ID":"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df","Type":"ContainerStarted","Data":"99852c14f993562a770fa59b0fe086f763d0c591deb936b20482b6f3c4170cdc"} Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.304602 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.305631 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.400604 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" event={"ID":"277e4ee9-1a86-464c-b4b8-9dcb1a17e1df","Type":"ContainerStarted","Data":"716b2a512b695e5f051f7d98ff0a96b134d4ef027b4c8393f28a934c4a2470c3"} Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 
09:55:25.402482 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" event={"ID":"15431b47-cf1b-425a-807a-7239e2947fff","Type":"ContainerStarted","Data":"9abcad9677f3438817d8a578e60b314d34532fa84da00fa12ad0e177bd378bc7"} Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.404010 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ll6ss" event={"ID":"f9c86541-a05e-4cee-9a09-2466ca8d588e","Type":"ContainerStarted","Data":"fa69801e1e71ba23cf8d034eecf88fd33fda81d7a80cb5e5550fff25ec47cf8d"} Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.404218 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.405441 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" event={"ID":"737d1d1e-6e9c-4d10-b3b9-101bd4b3e440","Type":"ContainerStarted","Data":"510edb200cce412a9858ebfca4a2cb5357c5d313b00e1b29039d2ae25309bfd4"} Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.428232 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-549c97b65-tgdmn" podStartSLOduration=4.428210329 podStartE2EDuration="4.428210329s" podCreationTimestamp="2025-10-03 09:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:55:22.405723521 +0000 UTC m=+704.202355408" watchObservedRunningTime="2025-10-03 09:55:25.428210329 +0000 UTC m=+707.224842206" Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.429206 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" podStartSLOduration=3.171949402 podStartE2EDuration="5.429197425s" podCreationTimestamp="2025-10-03 09:55:20 +0000 UTC" 
firstStartedPulling="2025-10-03 09:55:22.116215543 +0000 UTC m=+703.912847400" lastFinishedPulling="2025-10-03 09:55:24.373463566 +0000 UTC m=+706.170095423" observedRunningTime="2025-10-03 09:55:25.42362208 +0000 UTC m=+707.220253987" watchObservedRunningTime="2025-10-03 09:55:25.429197425 +0000 UTC m=+707.225829302" Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.457681 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-fb92b" podStartSLOduration=1.740274272 podStartE2EDuration="4.457656486s" podCreationTimestamp="2025-10-03 09:55:21 +0000 UTC" firstStartedPulling="2025-10-03 09:55:21.65099726 +0000 UTC m=+703.447629107" lastFinishedPulling="2025-10-03 09:55:24.368379464 +0000 UTC m=+706.165011321" observedRunningTime="2025-10-03 09:55:25.443676442 +0000 UTC m=+707.240308319" watchObservedRunningTime="2025-10-03 09:55:25.457656486 +0000 UTC m=+707.254288363" Oct 03 09:55:25 crc kubenswrapper[4990]: I1003 09:55:25.470729 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ll6ss" podStartSLOduration=2.453738592 podStartE2EDuration="5.470712516s" podCreationTimestamp="2025-10-03 09:55:20 +0000 UTC" firstStartedPulling="2025-10-03 09:55:21.366904063 +0000 UTC m=+703.163535920" lastFinishedPulling="2025-10-03 09:55:24.383877987 +0000 UTC m=+706.180509844" observedRunningTime="2025-10-03 09:55:25.465446939 +0000 UTC m=+707.262078796" watchObservedRunningTime="2025-10-03 09:55:25.470712516 +0000 UTC m=+707.267344383" Oct 03 09:55:26 crc kubenswrapper[4990]: I1003 09:55:26.412434 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:27 crc kubenswrapper[4990]: I1003 09:55:27.420834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" 
event={"ID":"15431b47-cf1b-425a-807a-7239e2947fff","Type":"ContainerStarted","Data":"8fd284dc9a1f37048675245f1bae301c9f954d0eea5b185c685cc0e69e447c3c"} Oct 03 09:55:27 crc kubenswrapper[4990]: I1003 09:55:27.439314 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-pn5fp" podStartSLOduration=2.255579102 podStartE2EDuration="7.439290523s" podCreationTimestamp="2025-10-03 09:55:20 +0000 UTC" firstStartedPulling="2025-10-03 09:55:21.52311553 +0000 UTC m=+703.319747387" lastFinishedPulling="2025-10-03 09:55:26.706826941 +0000 UTC m=+708.503458808" observedRunningTime="2025-10-03 09:55:27.43762 +0000 UTC m=+709.234251857" watchObservedRunningTime="2025-10-03 09:55:27.439290523 +0000 UTC m=+709.235922380" Oct 03 09:55:31 crc kubenswrapper[4990]: I1003 09:55:31.336403 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ll6ss" Oct 03 09:55:31 crc kubenswrapper[4990]: I1003 09:55:31.680008 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:31 crc kubenswrapper[4990]: I1003 09:55:31.680078 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:31 crc kubenswrapper[4990]: I1003 09:55:31.685682 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:32 crc kubenswrapper[4990]: I1003 09:55:32.458755 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-549c97b65-tgdmn" Oct 03 09:55:32 crc kubenswrapper[4990]: I1003 09:55:32.515580 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.217444 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.218714 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" podUID="7e3d6d60-b354-4999-a205-80a71688caec" containerName="controller-manager" containerID="cri-o://cee26b83a031e00252ee23a6e6d7cc5b66a576c524db357c13b023b9bb550780" gracePeriod=30 Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.316220 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.316494 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" podUID="0450636d-e918-475a-af84-690cac4baa47" containerName="route-controller-manager" containerID="cri-o://dc1468edacc205d487a893b5295e4625e0a7b8bf787fed27a291bbca0aed8604" gracePeriod=30 Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.519374 4990 generic.go:334] "Generic (PLEG): container finished" podID="7e3d6d60-b354-4999-a205-80a71688caec" containerID="cee26b83a031e00252ee23a6e6d7cc5b66a576c524db357c13b023b9bb550780" exitCode=0 Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.519469 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" event={"ID":"7e3d6d60-b354-4999-a205-80a71688caec","Type":"ContainerDied","Data":"cee26b83a031e00252ee23a6e6d7cc5b66a576c524db357c13b023b9bb550780"} Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.522258 4990 generic.go:334] "Generic (PLEG): container finished" podID="0450636d-e918-475a-af84-690cac4baa47" containerID="dc1468edacc205d487a893b5295e4625e0a7b8bf787fed27a291bbca0aed8604" exitCode=0 Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.522323 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" event={"ID":"0450636d-e918-475a-af84-690cac4baa47","Type":"ContainerDied","Data":"dc1468edacc205d487a893b5295e4625e0a7b8bf787fed27a291bbca0aed8604"} Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.656464 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.695153 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config\") pod \"7e3d6d60-b354-4999-a205-80a71688caec\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.695242 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca\") pod \"7e3d6d60-b354-4999-a205-80a71688caec\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.695295 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert\") pod \"7e3d6d60-b354-4999-a205-80a71688caec\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.695355 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8ztc\" (UniqueName: \"kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc\") pod \"7e3d6d60-b354-4999-a205-80a71688caec\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.695397 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles\") pod \"7e3d6d60-b354-4999-a205-80a71688caec\" (UID: \"7e3d6d60-b354-4999-a205-80a71688caec\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.696372 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e3d6d60-b354-4999-a205-80a71688caec" (UID: "7e3d6d60-b354-4999-a205-80a71688caec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.696424 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config" (OuterVolumeSpecName: "config") pod "7e3d6d60-b354-4999-a205-80a71688caec" (UID: "7e3d6d60-b354-4999-a205-80a71688caec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.696729 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e3d6d60-b354-4999-a205-80a71688caec" (UID: "7e3d6d60-b354-4999-a205-80a71688caec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.704076 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc" (OuterVolumeSpecName: "kube-api-access-x8ztc") pod "7e3d6d60-b354-4999-a205-80a71688caec" (UID: "7e3d6d60-b354-4999-a205-80a71688caec"). InnerVolumeSpecName "kube-api-access-x8ztc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.706337 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e3d6d60-b354-4999-a205-80a71688caec" (UID: "7e3d6d60-b354-4999-a205-80a71688caec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.735772 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.796330 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca\") pod \"0450636d-e918-475a-af84-690cac4baa47\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.796391 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert\") pod \"0450636d-e918-475a-af84-690cac4baa47\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.796424 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29xw\" (UniqueName: \"kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw\") pod \"0450636d-e918-475a-af84-690cac4baa47\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.796579 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config\") pod \"0450636d-e918-475a-af84-690cac4baa47\" (UID: \"0450636d-e918-475a-af84-690cac4baa47\") " Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.796990 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3d6d60-b354-4999-a205-80a71688caec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.797017 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8ztc\" (UniqueName: \"kubernetes.io/projected/7e3d6d60-b354-4999-a205-80a71688caec-kube-api-access-x8ztc\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.797035 4990 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.797050 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.797064 4990 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3d6d60-b354-4999-a205-80a71688caec-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.798018 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca" (OuterVolumeSpecName: "client-ca") pod "0450636d-e918-475a-af84-690cac4baa47" (UID: "0450636d-e918-475a-af84-690cac4baa47"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.798055 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config" (OuterVolumeSpecName: "config") pod "0450636d-e918-475a-af84-690cac4baa47" (UID: "0450636d-e918-475a-af84-690cac4baa47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.801133 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0450636d-e918-475a-af84-690cac4baa47" (UID: "0450636d-e918-475a-af84-690cac4baa47"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.801788 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw" (OuterVolumeSpecName: "kube-api-access-g29xw") pod "0450636d-e918-475a-af84-690cac4baa47" (UID: "0450636d-e918-475a-af84-690cac4baa47"). InnerVolumeSpecName "kube-api-access-g29xw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.899380 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.899433 4990 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0450636d-e918-475a-af84-690cac4baa47-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.899453 4990 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0450636d-e918-475a-af84-690cac4baa47-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:39 crc kubenswrapper[4990]: I1003 09:55:39.899472 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g29xw\" (UniqueName: \"kubernetes.io/projected/0450636d-e918-475a-af84-690cac4baa47-kube-api-access-g29xw\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.529986 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" event={"ID":"7e3d6d60-b354-4999-a205-80a71688caec","Type":"ContainerDied","Data":"26e2b8a989c63e4224b789d59de90a40cf5cf96ef07866d60386a8ca3e2bbacb"} Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.530007 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5gscv" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.530077 4990 scope.go:117] "RemoveContainer" containerID="cee26b83a031e00252ee23a6e6d7cc5b66a576c524db357c13b023b9bb550780" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.531318 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" event={"ID":"0450636d-e918-475a-af84-690cac4baa47","Type":"ContainerDied","Data":"93e1943934585663712dd796ae611441ed5a21e7edad0c86f9ea0458163f217c"} Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.531349 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.548260 4990 scope.go:117] "RemoveContainer" containerID="dc1468edacc205d487a893b5295e4625e0a7b8bf787fed27a291bbca0aed8604" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.571369 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.581047 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jmd4s"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.587175 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.592788 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5gscv"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.757632 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh"] 
Oct 03 09:55:40 crc kubenswrapper[4990]: E1003 09:55:40.758013 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0450636d-e918-475a-af84-690cac4baa47" containerName="route-controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.758034 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0450636d-e918-475a-af84-690cac4baa47" containerName="route-controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: E1003 09:55:40.758048 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3d6d60-b354-4999-a205-80a71688caec" containerName="controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.758056 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3d6d60-b354-4999-a205-80a71688caec" containerName="controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.758196 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3d6d60-b354-4999-a205-80a71688caec" containerName="controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.758219 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0450636d-e918-475a-af84-690cac4baa47" containerName="route-controller-manager" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.758872 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.761904 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.762531 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.769040 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.769269 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.769569 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.769884 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.769925 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770072 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770239 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770344 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770439 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770398 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 09:55:40 crc 
kubenswrapper[4990]: I1003 09:55:40.770553 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.770725 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.775066 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.778053 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.780997 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g"] Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814212 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4004a3f-e612-4d8e-884e-09844813e10f-serving-cert\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814266 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-config\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814299 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-client-ca\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814334 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60a694f2-af4e-4589-9a5b-0aafb0e1b646-serving-cert\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814448 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-client-ca\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814537 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-proxy-ca-bundles\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814589 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wt5s\" (UniqueName: \"kubernetes.io/projected/60a694f2-af4e-4589-9a5b-0aafb0e1b646-kube-api-access-6wt5s\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " 
pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814656 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj4s9\" (UniqueName: \"kubernetes.io/projected/f4004a3f-e612-4d8e-884e-09844813e10f-kube-api-access-cj4s9\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.814707 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-config\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.886104 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0450636d-e918-475a-af84-690cac4baa47" path="/var/lib/kubelet/pods/0450636d-e918-475a-af84-690cac4baa47/volumes" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.886710 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3d6d60-b354-4999-a205-80a71688caec" path="/var/lib/kubelet/pods/7e3d6d60-b354-4999-a205-80a71688caec/volumes" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916758 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-client-ca\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916814 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-proxy-ca-bundles\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916849 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wt5s\" (UniqueName: \"kubernetes.io/projected/60a694f2-af4e-4589-9a5b-0aafb0e1b646-kube-api-access-6wt5s\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj4s9\" (UniqueName: \"kubernetes.io/projected/f4004a3f-e612-4d8e-884e-09844813e10f-kube-api-access-cj4s9\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916912 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-config\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916948 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4004a3f-e612-4d8e-884e-09844813e10f-serving-cert\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916972 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-config\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.916995 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-client-ca\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.917023 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60a694f2-af4e-4589-9a5b-0aafb0e1b646-serving-cert\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.918589 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-proxy-ca-bundles\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.919193 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-config\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.919782 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-client-ca\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.920086 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4004a3f-e612-4d8e-884e-09844813e10f-config\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.920888 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60a694f2-af4e-4589-9a5b-0aafb0e1b646-client-ca\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.923607 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4004a3f-e612-4d8e-884e-09844813e10f-serving-cert\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.934340 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60a694f2-af4e-4589-9a5b-0aafb0e1b646-serving-cert\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.940492 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wt5s\" (UniqueName: \"kubernetes.io/projected/60a694f2-af4e-4589-9a5b-0aafb0e1b646-kube-api-access-6wt5s\") pod \"controller-manager-7fc876ffc7-s5mgh\" (UID: \"60a694f2-af4e-4589-9a5b-0aafb0e1b646\") " pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:40 crc kubenswrapper[4990]: I1003 09:55:40.942608 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj4s9\" (UniqueName: \"kubernetes.io/projected/f4004a3f-e612-4d8e-884e-09844813e10f-kube-api-access-cj4s9\") pod \"route-controller-manager-7cd8d94bb4-zr24g\" (UID: \"f4004a3f-e612-4d8e-884e-09844813e10f\") " pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.105423 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.121308 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.377315 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g"] Oct 03 09:55:41 crc kubenswrapper[4990]: W1003 09:55:41.388715 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4004a3f_e612_4d8e_884e_09844813e10f.slice/crio-9c4f84e1f6e15ce886d0a6b32e4e1f6ca0dc307030bd66f846ff775b8cd8317c WatchSource:0}: Error finding container 9c4f84e1f6e15ce886d0a6b32e4e1f6ca0dc307030bd66f846ff775b8cd8317c: Status 404 returned error can't find the container with id 9c4f84e1f6e15ce886d0a6b32e4e1f6ca0dc307030bd66f846ff775b8cd8317c Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.542070 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" event={"ID":"f4004a3f-e612-4d8e-884e-09844813e10f","Type":"ContainerStarted","Data":"c7782ba3d6080bad7583c7bfd350b6910f981bd3f79237067a2f3f35c1eb799b"} Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.542137 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" event={"ID":"f4004a3f-e612-4d8e-884e-09844813e10f","Type":"ContainerStarted","Data":"9c4f84e1f6e15ce886d0a6b32e4e1f6ca0dc307030bd66f846ff775b8cd8317c"} Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.542508 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.542573 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh"] Oct 03 09:55:41 crc 
kubenswrapper[4990]: I1003 09:55:41.544161 4990 patch_prober.go:28] interesting pod/route-controller-manager-7cd8d94bb4-zr24g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" start-of-body= Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.544210 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" podUID="f4004a3f-e612-4d8e-884e-09844813e10f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.48:8443/healthz\": dial tcp 10.217.0.48:8443: connect: connection refused" Oct 03 09:55:41 crc kubenswrapper[4990]: W1003 09:55:41.556288 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a694f2_af4e_4589_9a5b_0aafb0e1b646.slice/crio-0ecbee3a11618d22852b883337eabc28cda7f539a82cde57e11653937eabb6df WatchSource:0}: Error finding container 0ecbee3a11618d22852b883337eabc28cda7f539a82cde57e11653937eabb6df: Status 404 returned error can't find the container with id 0ecbee3a11618d22852b883337eabc28cda7f539a82cde57e11653937eabb6df Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.574795 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" podStartSLOduration=2.574765687 podStartE2EDuration="2.574765687s" podCreationTimestamp="2025-10-03 09:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:55:41.573545405 +0000 UTC m=+723.370177262" watchObservedRunningTime="2025-10-03 09:55:41.574765687 +0000 UTC m=+723.371397544" Oct 03 09:55:41 crc kubenswrapper[4990]: I1003 09:55:41.883048 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l94bb" Oct 03 09:55:42 crc kubenswrapper[4990]: I1003 09:55:42.551593 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" event={"ID":"60a694f2-af4e-4589-9a5b-0aafb0e1b646","Type":"ContainerStarted","Data":"53cb17a2584d59dfdb217a61790d9b17f8c14cc9bfa815232e0ac80a4f9cc95e"} Oct 03 09:55:42 crc kubenswrapper[4990]: I1003 09:55:42.551664 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" event={"ID":"60a694f2-af4e-4589-9a5b-0aafb0e1b646","Type":"ContainerStarted","Data":"0ecbee3a11618d22852b883337eabc28cda7f539a82cde57e11653937eabb6df"} Oct 03 09:55:42 crc kubenswrapper[4990]: I1003 09:55:42.557170 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cd8d94bb4-zr24g" Oct 03 09:55:42 crc kubenswrapper[4990]: I1003 09:55:42.575376 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" podStartSLOduration=3.57534516 podStartE2EDuration="3.57534516s" podCreationTimestamp="2025-10-03 09:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:55:42.570768932 +0000 UTC m=+724.367400789" watchObservedRunningTime="2025-10-03 09:55:42.57534516 +0000 UTC m=+724.371977017" Oct 03 09:55:43 crc kubenswrapper[4990]: I1003 09:55:43.559270 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:43 crc kubenswrapper[4990]: I1003 09:55:43.566049 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7fc876ffc7-s5mgh" Oct 03 09:55:50 crc kubenswrapper[4990]: I1003 09:55:50.830043 4990 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 09:55:55 crc kubenswrapper[4990]: I1003 09:55:55.304354 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:55:55 crc kubenswrapper[4990]: I1003 09:55:55.305189 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.034899 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j"] Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.036628 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.040722 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.048332 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j"] Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.089260 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.089342 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.089622 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9vc\" (UniqueName: \"kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: 
I1003 09:55:57.191943 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9vc\" (UniqueName: \"kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.192357 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.192561 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.193212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.193259 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.218788 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9vc\" (UniqueName: \"kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.402129 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.577028 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5jgtm" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerName="console" containerID="cri-o://304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056" gracePeriod=15 Oct 03 09:55:57 crc kubenswrapper[4990]: I1003 09:55:57.831599 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j"] Oct 03 09:55:57 crc kubenswrapper[4990]: W1003 09:55:57.840184 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86cb1d7_8858_480b_8f13_29f9d3d6a5e9.slice/crio-17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9 WatchSource:0}: Error finding container 17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9: Status 404 returned 
error can't find the container with id 17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9 Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.035068 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5jgtm_ee5a405d-75fe-4968-881c-62d8c6d0dd5a/console/0.log" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.035666 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.105873 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnsb5\" (UniqueName: \"kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.105970 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.105994 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.106082 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: 
I1003 09:55:58.106115 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.106137 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.106199 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle\") pod \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\" (UID: \"ee5a405d-75fe-4968-881c-62d8c6d0dd5a\") " Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.107488 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.107538 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config" (OuterVolumeSpecName: "console-config") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.107725 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.107759 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.113284 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.113892 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.113949 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5" (OuterVolumeSpecName: "kube-api-access-hnsb5") pod "ee5a405d-75fe-4968-881c-62d8c6d0dd5a" (UID: "ee5a405d-75fe-4968-881c-62d8c6d0dd5a"). InnerVolumeSpecName "kube-api-access-hnsb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.208432 4990 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.208839 4990 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.208903 4990 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.209002 4990 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.209068 4990 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.209127 4990 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hnsb5\" (UniqueName: \"kubernetes.io/projected/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-kube-api-access-hnsb5\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.209179 4990 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ee5a405d-75fe-4968-881c-62d8c6d0dd5a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669039 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5jgtm_ee5a405d-75fe-4968-881c-62d8c6d0dd5a/console/0.log" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669120 4990 generic.go:334] "Generic (PLEG): container finished" podID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerID="304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056" exitCode=2 Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669212 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jgtm" event={"ID":"ee5a405d-75fe-4968-881c-62d8c6d0dd5a","Type":"ContainerDied","Data":"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056"} Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669225 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5jgtm" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669258 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5jgtm" event={"ID":"ee5a405d-75fe-4968-881c-62d8c6d0dd5a","Type":"ContainerDied","Data":"70a0ca07068b717c6349f61faf20f21b9a9819390e97bbabaea02c512c51bf31"} Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.669291 4990 scope.go:117] "RemoveContainer" containerID="304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.671939 4990 generic.go:334] "Generic (PLEG): container finished" podID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerID="07d3301c12c719214686c4728f3a3958a52fb1b0e3203e6031a912ddd99c2ba6" exitCode=0 Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.672038 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" event={"ID":"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9","Type":"ContainerDied","Data":"07d3301c12c719214686c4728f3a3958a52fb1b0e3203e6031a912ddd99c2ba6"} Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.672107 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" event={"ID":"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9","Type":"ContainerStarted","Data":"17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9"} Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.705966 4990 scope.go:117] "RemoveContainer" containerID="304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056" Oct 03 09:55:58 crc kubenswrapper[4990]: E1003 09:55:58.706625 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056\": container with ID 
starting with 304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056 not found: ID does not exist" containerID="304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.706719 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056"} err="failed to get container status \"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056\": rpc error: code = NotFound desc = could not find container \"304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056\": container with ID starting with 304bb0be18622fae213a201f3aa27cbba3e10fe5606ca6ee71ebf013172f8056 not found: ID does not exist" Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.740152 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.742921 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5jgtm"] Oct 03 09:55:58 crc kubenswrapper[4990]: I1003 09:55:58.884305 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" path="/var/lib/kubelet/pods/ee5a405d-75fe-4968-881c-62d8c6d0dd5a/volumes" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.387313 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:00 crc kubenswrapper[4990]: E1003 09:56:00.389775 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerName="console" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.389940 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerName="console" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.390293 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5a405d-75fe-4968-881c-62d8c6d0dd5a" containerName="console" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.396073 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.399943 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.454637 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.454716 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rc2\" (UniqueName: \"kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.454891 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.555649 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content\") pod 
\"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.555744 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.555800 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rc2\" (UniqueName: \"kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.556239 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.556305 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities\") pod \"redhat-operators-t2jzv\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.576791 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rc2\" (UniqueName: \"kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2\") pod \"redhat-operators-t2jzv\" (UID: 
\"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:00 crc kubenswrapper[4990]: I1003 09:56:00.746017 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.196149 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:01 crc kubenswrapper[4990]: W1003 09:56:01.211858 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadadae02_09e8_40ea_a7ff_f4b5b6b320e7.slice/crio-11f4189b0556a4fce95750997e06e6933a9d3fb85eae2fc221537713bdcc3d3a WatchSource:0}: Error finding container 11f4189b0556a4fce95750997e06e6933a9d3fb85eae2fc221537713bdcc3d3a: Status 404 returned error can't find the container with id 11f4189b0556a4fce95750997e06e6933a9d3fb85eae2fc221537713bdcc3d3a Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.698767 4990 generic.go:334] "Generic (PLEG): container finished" podID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerID="8eb8f821aaae7adc7c7d4824e5c3d6a743b78f30f0b313423b83e360bfb50021" exitCode=0 Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.698847 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerDied","Data":"8eb8f821aaae7adc7c7d4824e5c3d6a743b78f30f0b313423b83e360bfb50021"} Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.699304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerStarted","Data":"11f4189b0556a4fce95750997e06e6933a9d3fb85eae2fc221537713bdcc3d3a"} Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.704985 4990 generic.go:334] "Generic (PLEG): container 
finished" podID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerID="d7e9ecf075f73ecddebfe7516b2a2df92e56856d076eaae43b63dcfa21d836c3" exitCode=0 Oct 03 09:56:01 crc kubenswrapper[4990]: I1003 09:56:01.705051 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" event={"ID":"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9","Type":"ContainerDied","Data":"d7e9ecf075f73ecddebfe7516b2a2df92e56856d076eaae43b63dcfa21d836c3"} Oct 03 09:56:02 crc kubenswrapper[4990]: I1003 09:56:02.714726 4990 generic.go:334] "Generic (PLEG): container finished" podID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerID="c979d2ae78bdf3276317e66b421220b312271a5c790a10140ce77698f6bef17d" exitCode=0 Oct 03 09:56:02 crc kubenswrapper[4990]: I1003 09:56:02.714847 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" event={"ID":"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9","Type":"ContainerDied","Data":"c979d2ae78bdf3276317e66b421220b312271a5c790a10140ce77698f6bef17d"} Oct 03 09:56:02 crc kubenswrapper[4990]: I1003 09:56:02.719232 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerStarted","Data":"1523c5e0b9d47a021beb8e1fef3d9aaf221c55609c8cf22a00011245d7f96f1e"} Oct 03 09:56:03 crc kubenswrapper[4990]: I1003 09:56:03.730438 4990 generic.go:334] "Generic (PLEG): container finished" podID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerID="1523c5e0b9d47a021beb8e1fef3d9aaf221c55609c8cf22a00011245d7f96f1e" exitCode=0 Oct 03 09:56:03 crc kubenswrapper[4990]: I1003 09:56:03.730596 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" 
event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerDied","Data":"1523c5e0b9d47a021beb8e1fef3d9aaf221c55609c8cf22a00011245d7f96f1e"} Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.088954 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.113348 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle\") pod \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.113461 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w9vc\" (UniqueName: \"kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc\") pod \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.113501 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util\") pod \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\" (UID: \"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9\") " Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.115045 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle" (OuterVolumeSpecName: "bundle") pod "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" (UID: "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.120282 4990 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.121912 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc" (OuterVolumeSpecName: "kube-api-access-6w9vc") pod "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" (UID: "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9"). InnerVolumeSpecName "kube-api-access-6w9vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.124909 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util" (OuterVolumeSpecName: "util") pod "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" (UID: "b86cb1d7-8858-480b-8f13-29f9d3d6a5e9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.222233 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w9vc\" (UniqueName: \"kubernetes.io/projected/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-kube-api-access-6w9vc\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.222669 4990 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b86cb1d7-8858-480b-8f13-29f9d3d6a5e9-util\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.760204 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" event={"ID":"b86cb1d7-8858-480b-8f13-29f9d3d6a5e9","Type":"ContainerDied","Data":"17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9"} Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.760287 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f8eb793e5ea5b346c1f2d89a23350c0c8ef8ac98ba59008347fff13109c9f9" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.760417 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j" Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.766583 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerStarted","Data":"c8865aaefee9b6f73de7b6a6f42282bdb7ad88d944c575efc54a31650581739a"} Oct 03 09:56:04 crc kubenswrapper[4990]: I1003 09:56:04.792655 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2jzv" podStartSLOduration=2.324200737 podStartE2EDuration="4.792627911s" podCreationTimestamp="2025-10-03 09:56:00 +0000 UTC" firstStartedPulling="2025-10-03 09:56:01.702204626 +0000 UTC m=+743.498836513" lastFinishedPulling="2025-10-03 09:56:04.17063182 +0000 UTC m=+745.967263687" observedRunningTime="2025-10-03 09:56:04.792329143 +0000 UTC m=+746.588961010" watchObservedRunningTime="2025-10-03 09:56:04.792627911 +0000 UTC m=+746.589259778" Oct 03 09:56:10 crc kubenswrapper[4990]: I1003 09:56:10.746439 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:10 crc kubenswrapper[4990]: I1003 09:56:10.747301 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:10 crc kubenswrapper[4990]: I1003 09:56:10.792494 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:10 crc kubenswrapper[4990]: I1003 09:56:10.850616 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.369660 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:14 
crc kubenswrapper[4990]: I1003 09:56:14.369929 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2jzv" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="registry-server" containerID="cri-o://c8865aaefee9b6f73de7b6a6f42282bdb7ad88d944c575efc54a31650581739a" gracePeriod=2 Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.831900 4990 generic.go:334] "Generic (PLEG): container finished" podID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerID="c8865aaefee9b6f73de7b6a6f42282bdb7ad88d944c575efc54a31650581739a" exitCode=0 Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.831995 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerDied","Data":"c8865aaefee9b6f73de7b6a6f42282bdb7ad88d944c575efc54a31650581739a"} Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.906333 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.975929 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities\") pod \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.976039 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5rc2\" (UniqueName: \"kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2\") pod \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.976064 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content\") pod \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\" (UID: \"adadae02-09e8-40ea-a7ff-f4b5b6b320e7\") " Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.977125 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities" (OuterVolumeSpecName: "utilities") pod "adadae02-09e8-40ea-a7ff-f4b5b6b320e7" (UID: "adadae02-09e8-40ea-a7ff-f4b5b6b320e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:14 crc kubenswrapper[4990]: I1003 09:56:14.996808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2" (OuterVolumeSpecName: "kube-api-access-b5rc2") pod "adadae02-09e8-40ea-a7ff-f4b5b6b320e7" (UID: "adadae02-09e8-40ea-a7ff-f4b5b6b320e7"). InnerVolumeSpecName "kube-api-access-b5rc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.071962 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adadae02-09e8-40ea-a7ff-f4b5b6b320e7" (UID: "adadae02-09e8-40ea-a7ff-f4b5b6b320e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.078078 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.078129 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5rc2\" (UniqueName: \"kubernetes.io/projected/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-kube-api-access-b5rc2\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.078141 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adadae02-09e8-40ea-a7ff-f4b5b6b320e7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.843482 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2jzv" event={"ID":"adadae02-09e8-40ea-a7ff-f4b5b6b320e7","Type":"ContainerDied","Data":"11f4189b0556a4fce95750997e06e6933a9d3fb85eae2fc221537713bdcc3d3a"} Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.843591 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2jzv" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.844061 4990 scope.go:117] "RemoveContainer" containerID="c8865aaefee9b6f73de7b6a6f42282bdb7ad88d944c575efc54a31650581739a" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.866277 4990 scope.go:117] "RemoveContainer" containerID="1523c5e0b9d47a021beb8e1fef3d9aaf221c55609c8cf22a00011245d7f96f1e" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.875322 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.886451 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2jzv"] Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.901946 4990 scope.go:117] "RemoveContainer" containerID="8eb8f821aaae7adc7c7d4824e5c3d6a743b78f30f0b313423b83e360bfb50021" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983143 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct"] Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983385 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="util" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983397 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="util" Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983417 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="pull" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983423 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="pull" Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983433 4990 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="extract" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983440 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="extract" Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983448 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="extract-content" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983455 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="extract-content" Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983464 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="extract-utilities" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983471 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="extract-utilities" Oct 03 09:56:15 crc kubenswrapper[4990]: E1003 09:56:15.983483 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="registry-server" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983489 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="registry-server" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983627 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" containerName="registry-server" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.983639 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86cb1d7-8858-480b-8f13-29f9d3d6a5e9" containerName="extract" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.984044 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.985896 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.985927 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fczmh" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.986481 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.986772 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 09:56:15 crc kubenswrapper[4990]: I1003 09:56:15.986964 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.005878 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct"] Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.088691 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-webhook-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.088768 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9rk7\" (UniqueName: \"kubernetes.io/projected/16ec1111-996b-4343-a1dd-8111dc1ee20f-kube-api-access-v9rk7\") pod 
\"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.088815 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-apiservice-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.190030 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-webhook-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.190107 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9rk7\" (UniqueName: \"kubernetes.io/projected/16ec1111-996b-4343-a1dd-8111dc1ee20f-kube-api-access-v9rk7\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.190150 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-apiservice-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc 
kubenswrapper[4990]: I1003 09:56:16.194829 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-webhook-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.200943 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16ec1111-996b-4343-a1dd-8111dc1ee20f-apiservice-cert\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.214255 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9rk7\" (UniqueName: \"kubernetes.io/projected/16ec1111-996b-4343-a1dd-8111dc1ee20f-kube-api-access-v9rk7\") pod \"metallb-operator-controller-manager-67bf5f5c84-pddct\" (UID: \"16ec1111-996b-4343-a1dd-8111dc1ee20f\") " pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.234988 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg"] Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.235791 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.238209 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.238493 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fftql" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.252624 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.255841 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg"] Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.291820 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.291930 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffm8\" (UniqueName: \"kubernetes.io/projected/36b8b036-fee3-4797-b99b-81ebe6f5c659-kube-api-access-pffm8\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.291979 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-webhook-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.298790 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.393659 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-webhook-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.393733 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.393807 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffm8\" (UniqueName: \"kubernetes.io/projected/36b8b036-fee3-4797-b99b-81ebe6f5c659-kube-api-access-pffm8\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.403534 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-apiservice-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.405320 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/36b8b036-fee3-4797-b99b-81ebe6f5c659-webhook-cert\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.422127 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffm8\" (UniqueName: \"kubernetes.io/projected/36b8b036-fee3-4797-b99b-81ebe6f5c659-kube-api-access-pffm8\") pod \"metallb-operator-webhook-server-7c6bfff67b-b9rrg\" (UID: \"36b8b036-fee3-4797-b99b-81ebe6f5c659\") " pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.576783 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.835208 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct"] Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.866553 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" event={"ID":"16ec1111-996b-4343-a1dd-8111dc1ee20f","Type":"ContainerStarted","Data":"a80a1cd5e709905aba55bf490b1d438908acc0d9280857f1aa3de0643e19d030"} Oct 03 09:56:16 crc kubenswrapper[4990]: I1003 09:56:16.882650 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adadae02-09e8-40ea-a7ff-f4b5b6b320e7" path="/var/lib/kubelet/pods/adadae02-09e8-40ea-a7ff-f4b5b6b320e7/volumes" Oct 03 09:56:17 crc kubenswrapper[4990]: I1003 09:56:17.211409 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg"] Oct 03 09:56:17 crc kubenswrapper[4990]: W1003 09:56:17.222306 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36b8b036_fee3_4797_b99b_81ebe6f5c659.slice/crio-82114d2e3902ed1159ecfb8accd7135414bab65bc130f468c0e593455e4f817e WatchSource:0}: Error finding container 82114d2e3902ed1159ecfb8accd7135414bab65bc130f468c0e593455e4f817e: Status 404 returned error can't find the container with id 82114d2e3902ed1159ecfb8accd7135414bab65bc130f468c0e593455e4f817e Oct 03 09:56:17 crc kubenswrapper[4990]: I1003 09:56:17.873270 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" event={"ID":"36b8b036-fee3-4797-b99b-81ebe6f5c659","Type":"ContainerStarted","Data":"82114d2e3902ed1159ecfb8accd7135414bab65bc130f468c0e593455e4f817e"} Oct 03 09:56:20 crc kubenswrapper[4990]: I1003 
09:56:20.899963 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" event={"ID":"16ec1111-996b-4343-a1dd-8111dc1ee20f","Type":"ContainerStarted","Data":"2aea7e2bad670635b8858280e7b44f94678aa779ac13ed94a97b5c1121dc8b22"} Oct 03 09:56:20 crc kubenswrapper[4990]: I1003 09:56:20.900733 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:20 crc kubenswrapper[4990]: I1003 09:56:20.927276 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" podStartSLOduration=2.925408581 podStartE2EDuration="5.927239227s" podCreationTimestamp="2025-10-03 09:56:15 +0000 UTC" firstStartedPulling="2025-10-03 09:56:16.843834685 +0000 UTC m=+758.640466542" lastFinishedPulling="2025-10-03 09:56:19.845665331 +0000 UTC m=+761.642297188" observedRunningTime="2025-10-03 09:56:20.921526009 +0000 UTC m=+762.718157886" watchObservedRunningTime="2025-10-03 09:56:20.927239227 +0000 UTC m=+762.723871084" Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.972303 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.973798 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.989524 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.994418 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.994533 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:21 crc kubenswrapper[4990]: I1003 09:56:21.994589 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txzfz\" (UniqueName: \"kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.096338 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.096420 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-txzfz\" (UniqueName: \"kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.096476 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.097294 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.097394 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.118978 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txzfz\" (UniqueName: \"kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz\") pod \"certified-operators-r8jvb\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.298942 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.638143 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.914658 4990 generic.go:334] "Generic (PLEG): container finished" podID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerID="33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc" exitCode=0 Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.914716 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerDied","Data":"33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc"} Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.914783 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerStarted","Data":"ea1cca3b14edbef70695bb61d253f54e78d9fbfad46f945a143f60c274cc6f7f"} Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.916791 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" event={"ID":"36b8b036-fee3-4797-b99b-81ebe6f5c659","Type":"ContainerStarted","Data":"b182ea29575397373913358996ace6622413980174066f81f0330215ef69bd87"} Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.916942 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:22 crc kubenswrapper[4990]: I1003 09:56:22.959253 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" podStartSLOduration=2.023686251 podStartE2EDuration="6.959228756s" 
podCreationTimestamp="2025-10-03 09:56:16 +0000 UTC" firstStartedPulling="2025-10-03 09:56:17.224897323 +0000 UTC m=+759.021529180" lastFinishedPulling="2025-10-03 09:56:22.160439828 +0000 UTC m=+763.957071685" observedRunningTime="2025-10-03 09:56:22.958329463 +0000 UTC m=+764.754961330" watchObservedRunningTime="2025-10-03 09:56:22.959228756 +0000 UTC m=+764.755860613" Oct 03 09:56:24 crc kubenswrapper[4990]: I1003 09:56:24.931294 4990 generic.go:334] "Generic (PLEG): container finished" podID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerID="35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68" exitCode=0 Oct 03 09:56:24 crc kubenswrapper[4990]: I1003 09:56:24.931346 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerDied","Data":"35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68"} Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.304026 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.304091 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.304136 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.304685 4990 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.304743 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278" gracePeriod=600 Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.939358 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerStarted","Data":"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6"} Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.942663 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278" exitCode=0 Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.942731 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278"} Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.942825 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1"} Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.942855 4990 scope.go:117] "RemoveContainer" containerID="75ad13ec688a2b527a118289cf1f305adbf3e1b7dffe3fac1e83356acccea657" Oct 03 09:56:25 crc kubenswrapper[4990]: I1003 09:56:25.967295 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8jvb" podStartSLOduration=2.5019646399999997 podStartE2EDuration="4.967271824s" podCreationTimestamp="2025-10-03 09:56:21 +0000 UTC" firstStartedPulling="2025-10-03 09:56:22.917028321 +0000 UTC m=+764.713660178" lastFinishedPulling="2025-10-03 09:56:25.382335505 +0000 UTC m=+767.178967362" observedRunningTime="2025-10-03 09:56:25.964097411 +0000 UTC m=+767.760729278" watchObservedRunningTime="2025-10-03 09:56:25.967271824 +0000 UTC m=+767.763903671" Oct 03 09:56:32 crc kubenswrapper[4990]: I1003 09:56:32.299395 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:32 crc kubenswrapper[4990]: I1003 09:56:32.300395 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:32 crc kubenswrapper[4990]: I1003 09:56:32.339617 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:33 crc kubenswrapper[4990]: I1003 09:56:33.039761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:34 crc kubenswrapper[4990]: I1003 09:56:34.766205 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.000870 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8jvb" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="registry-server" containerID="cri-o://f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6" gracePeriod=2 Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.412293 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.483439 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txzfz\" (UniqueName: \"kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz\") pod \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.483566 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities\") pod \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.483636 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content\") pod \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\" (UID: \"7af423a4-a8d6-40e7-882a-3604e0abbd0c\") " Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.484809 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities" (OuterVolumeSpecName: "utilities") pod "7af423a4-a8d6-40e7-882a-3604e0abbd0c" (UID: "7af423a4-a8d6-40e7-882a-3604e0abbd0c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.495754 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz" (OuterVolumeSpecName: "kube-api-access-txzfz") pod "7af423a4-a8d6-40e7-882a-3604e0abbd0c" (UID: "7af423a4-a8d6-40e7-882a-3604e0abbd0c"). InnerVolumeSpecName "kube-api-access-txzfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.567316 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7af423a4-a8d6-40e7-882a-3604e0abbd0c" (UID: "7af423a4-a8d6-40e7-882a-3604e0abbd0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.585265 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txzfz\" (UniqueName: \"kubernetes.io/projected/7af423a4-a8d6-40e7-882a-3604e0abbd0c-kube-api-access-txzfz\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.585305 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:35 crc kubenswrapper[4990]: I1003 09:56:35.585315 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7af423a4-a8d6-40e7-882a-3604e0abbd0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.010113 4990 generic.go:334] "Generic (PLEG): container finished" podID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" 
containerID="f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6" exitCode=0 Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.010193 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerDied","Data":"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6"} Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.010230 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8jvb" event={"ID":"7af423a4-a8d6-40e7-882a-3604e0abbd0c","Type":"ContainerDied","Data":"ea1cca3b14edbef70695bb61d253f54e78d9fbfad46f945a143f60c274cc6f7f"} Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.010253 4990 scope.go:117] "RemoveContainer" containerID="f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.010244 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8jvb" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.034318 4990 scope.go:117] "RemoveContainer" containerID="35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.049094 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.054220 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8jvb"] Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.057222 4990 scope.go:117] "RemoveContainer" containerID="33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.072995 4990 scope.go:117] "RemoveContainer" containerID="f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6" Oct 03 09:56:36 crc kubenswrapper[4990]: E1003 09:56:36.073396 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6\": container with ID starting with f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6 not found: ID does not exist" containerID="f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.073454 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6"} err="failed to get container status \"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6\": rpc error: code = NotFound desc = could not find container \"f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6\": container with ID starting with f0a650df377eb797be5f42b45d7ee17f3a19beb780ef0dc66087e56e6fd187e6 not 
found: ID does not exist" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.073498 4990 scope.go:117] "RemoveContainer" containerID="35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68" Oct 03 09:56:36 crc kubenswrapper[4990]: E1003 09:56:36.073868 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68\": container with ID starting with 35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68 not found: ID does not exist" containerID="35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.073891 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68"} err="failed to get container status \"35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68\": rpc error: code = NotFound desc = could not find container \"35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68\": container with ID starting with 35cc4f5eb0168582d6d6f6d89c77899c5e0eb1a27746856652e2811cd7366b68 not found: ID does not exist" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.073911 4990 scope.go:117] "RemoveContainer" containerID="33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc" Oct 03 09:56:36 crc kubenswrapper[4990]: E1003 09:56:36.074999 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc\": container with ID starting with 33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc not found: ID does not exist" containerID="33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.075031 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc"} err="failed to get container status \"33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc\": rpc error: code = NotFound desc = could not find container \"33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc\": container with ID starting with 33961295ad50e2b8ef42f7903cc32bfddd40a6bb9ed282368fadecc2ae03e4fc not found: ID does not exist" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.589222 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c6bfff67b-b9rrg" Oct 03 09:56:36 crc kubenswrapper[4990]: I1003 09:56:36.878612 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" path="/var/lib/kubelet/pods/7af423a4-a8d6-40e7-882a-3604e0abbd0c/volumes" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.778839 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:56:51 crc kubenswrapper[4990]: E1003 09:56:51.779795 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="extract-content" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.779811 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="extract-content" Oct 03 09:56:51 crc kubenswrapper[4990]: E1003 09:56:51.779827 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="registry-server" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.779833 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="registry-server" Oct 03 09:56:51 crc kubenswrapper[4990]: E1003 
09:56:51.779849 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="extract-utilities" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.779856 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="extract-utilities" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.779977 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af423a4-a8d6-40e7-882a-3604e0abbd0c" containerName="registry-server" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.780742 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.791643 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.810438 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.810549 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.810589 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849q5\" (UniqueName: 
\"kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.911494 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.911568 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849q5\" (UniqueName: \"kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.911670 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.912130 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.912755 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:51 crc kubenswrapper[4990]: I1003 09:56:51.935334 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849q5\" (UniqueName: \"kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5\") pod \"redhat-marketplace-7n7ws\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:52 crc kubenswrapper[4990]: I1003 09:56:52.104642 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:56:52 crc kubenswrapper[4990]: I1003 09:56:52.545634 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.115037 4990 generic.go:334] "Generic (PLEG): container finished" podID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerID="c255bca54619df14a077bb05204ccebcc5bfa409e0f9c0f9ef16dcbebb1c192b" exitCode=0 Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.115115 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerDied","Data":"c255bca54619df14a077bb05204ccebcc5bfa409e0f9c0f9ef16dcbebb1c192b"} Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.115206 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerStarted","Data":"46ac8038b6b40c348cc165a23b76b5d9cfb2aa1eec6aa699e708a15bd17dc279"} Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.975313 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.977041 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:53 crc kubenswrapper[4990]: I1003 09:56:53.993142 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.041284 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkdw\" (UniqueName: \"kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.041342 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.041381 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.124620 4990 generic.go:334] "Generic (PLEG): container finished" podID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerID="2c673cb891335fe42611f9cc8c5b11af3ccfb13f4dc67d070c1c43deb68c9190" exitCode=0 Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.124656 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerDied","Data":"2c673cb891335fe42611f9cc8c5b11af3ccfb13f4dc67d070c1c43deb68c9190"} Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.142927 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.143055 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkdw\" (UniqueName: \"kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.143116 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.143617 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.143764 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.164477 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkdw\" (UniqueName: \"kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw\") pod \"community-operators-x4fzs\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.337023 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:56:54 crc kubenswrapper[4990]: I1003 09:56:54.644445 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:56:55 crc kubenswrapper[4990]: I1003 09:56:55.132321 4990 generic.go:334] "Generic (PLEG): container finished" podID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerID="89e034e10465e3472936c27fbb39634d6db5ea147ab386aad010adb428272bdf" exitCode=0 Oct 03 09:56:55 crc kubenswrapper[4990]: I1003 09:56:55.132375 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerDied","Data":"89e034e10465e3472936c27fbb39634d6db5ea147ab386aad010adb428272bdf"} Oct 03 09:56:55 crc kubenswrapper[4990]: I1003 09:56:55.132425 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerStarted","Data":"59700c9d8458dc4fca5bd8f1407904f8be4dae4d83804477e0e85ab470c526c6"} Oct 03 09:56:55 crc kubenswrapper[4990]: I1003 09:56:55.135460 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerStarted","Data":"ee8d484ae9227be525582162b70f3fe0a7f5cdf3665d3af0b5711a28f085b9b4"} Oct 03 09:56:55 crc kubenswrapper[4990]: I1003 09:56:55.173555 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7n7ws" podStartSLOduration=2.742732136 podStartE2EDuration="4.173535404s" podCreationTimestamp="2025-10-03 09:56:51 +0000 UTC" firstStartedPulling="2025-10-03 09:56:53.117864101 +0000 UTC m=+794.914495968" lastFinishedPulling="2025-10-03 09:56:54.548667379 +0000 UTC m=+796.345299236" observedRunningTime="2025-10-03 09:56:55.172194789 +0000 UTC m=+796.968826656" watchObservedRunningTime="2025-10-03 09:56:55.173535404 +0000 UTC m=+796.970167271" Oct 03 09:56:56 crc kubenswrapper[4990]: I1003 09:56:56.145449 4990 generic.go:334] "Generic (PLEG): container finished" podID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerID="d61fca09551d7f6d480e627523de4e901adcedfd108550b223b3ee76a03140b5" exitCode=0 Oct 03 09:56:56 crc kubenswrapper[4990]: I1003 09:56:56.145552 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerDied","Data":"d61fca09551d7f6d480e627523de4e901adcedfd108550b223b3ee76a03140b5"} Oct 03 09:56:56 crc kubenswrapper[4990]: I1003 09:56:56.303017 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-67bf5f5c84-pddct" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.095623 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.097168 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.103486 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.104303 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qprvb"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.106455 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.109592 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.113547 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.113835 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.115817 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-65fg7" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.165923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerStarted","Data":"07e2e19978fe2628ac8a1e7ad64932fbeca11159cb5c38a9ac913a5716a285da"} Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180308 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrtp\" (UniqueName: \"kubernetes.io/projected/c3e8a922-2054-46e4-b45e-4c525e08f72f-kube-api-access-xdrtp\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180378 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180419 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wmf\" (UniqueName: \"kubernetes.io/projected/9bc795a9-e956-4de9-b9e8-98a20a7880af-kube-api-access-n5wmf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180460 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-conf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180481 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-startup\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180540 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3e8a922-2054-46e4-b45e-4c525e08f72f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 
crc kubenswrapper[4990]: I1003 09:56:57.180580 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-sockets\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180603 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-reloader\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.180726 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics-certs\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.191591 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x4fzs" podStartSLOduration=2.760820434 podStartE2EDuration="4.19156991s" podCreationTimestamp="2025-10-03 09:56:53 +0000 UTC" firstStartedPulling="2025-10-03 09:56:55.134553883 +0000 UTC m=+796.931185740" lastFinishedPulling="2025-10-03 09:56:56.565303359 +0000 UTC m=+798.361935216" observedRunningTime="2025-10-03 09:56:57.186440237 +0000 UTC m=+798.983072094" watchObservedRunningTime="2025-10-03 09:56:57.19156991 +0000 UTC m=+798.988201767" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.230099 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jr4nb"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.231386 4990 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.233747 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.234205 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.237161 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9vxj6" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.237212 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.250307 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-pjbnl"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.251218 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.253480 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.262228 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-pjbnl"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282235 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282313 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtrj9\" (UniqueName: \"kubernetes.io/projected/031a0646-6ed0-467a-8042-5e0096fb631d-kube-api-access-gtrj9\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282349 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrtp\" (UniqueName: \"kubernetes.io/projected/c3e8a922-2054-46e4-b45e-4c525e08f72f-kube-api-access-xdrtp\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282537 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc 
kubenswrapper[4990]: I1003 09:56:57.282630 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wmf\" (UniqueName: \"kubernetes.io/projected/9bc795a9-e956-4de9-b9e8-98a20a7880af-kube-api-access-n5wmf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282666 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282712 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-cert\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282763 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282818 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-conf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282864 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-startup\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/031a0646-6ed0-467a-8042-5e0096fb631d-metallb-excludel2\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282952 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3e8a922-2054-46e4-b45e-4c525e08f72f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282974 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t775\" (UniqueName: \"kubernetes.io/projected/d4754a5f-654a-40cd-999f-355a60ad887b-kube-api-access-5t775\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.282997 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-reloader\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283017 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-sockets\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283064 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics-certs\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283270 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-conf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283535 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-reloader\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.283604 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-sockets\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.284333 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9bc795a9-e956-4de9-b9e8-98a20a7880af-frr-startup\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.301611 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9bc795a9-e956-4de9-b9e8-98a20a7880af-metrics-certs\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.304710 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrtp\" (UniqueName: \"kubernetes.io/projected/c3e8a922-2054-46e4-b45e-4c525e08f72f-kube-api-access-xdrtp\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.306171 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wmf\" (UniqueName: \"kubernetes.io/projected/9bc795a9-e956-4de9-b9e8-98a20a7880af-kube-api-access-n5wmf\") pod \"frr-k8s-qprvb\" (UID: \"9bc795a9-e956-4de9-b9e8-98a20a7880af\") " pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.307235 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3e8a922-2054-46e4-b45e-4c525e08f72f-cert\") pod \"frr-k8s-webhook-server-64bf5d555-m6s44\" (UID: \"c3e8a922-2054-46e4-b45e-4c525e08f72f\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.384884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-cert\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.384955 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.384995 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/031a0646-6ed0-467a-8042-5e0096fb631d-metallb-excludel2\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.385035 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t775\" (UniqueName: \"kubernetes.io/projected/d4754a5f-654a-40cd-999f-355a60ad887b-kube-api-access-5t775\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.385073 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.385103 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtrj9\" (UniqueName: \"kubernetes.io/projected/031a0646-6ed0-467a-8042-5e0096fb631d-kube-api-access-gtrj9\") pod \"speaker-jr4nb\" (UID: 
\"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.385141 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385308 4990 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385376 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs podName:d4754a5f-654a-40cd-999f-355a60ad887b nodeName:}" failed. No retries permitted until 2025-10-03 09:56:57.885354028 +0000 UTC m=+799.681985885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs") pod "controller-68d546b9d8-pjbnl" (UID: "d4754a5f-654a-40cd-999f-355a60ad887b") : secret "controller-certs-secret" not found Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385773 4990 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385803 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist podName:031a0646-6ed0-467a-8042-5e0096fb631d nodeName:}" failed. No retries permitted until 2025-10-03 09:56:57.885795359 +0000 UTC m=+799.682427216 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist") pod "speaker-jr4nb" (UID: "031a0646-6ed0-467a-8042-5e0096fb631d") : secret "metallb-memberlist" not found Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385876 4990 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.385958 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs podName:031a0646-6ed0-467a-8042-5e0096fb631d nodeName:}" failed. No retries permitted until 2025-10-03 09:56:57.885939473 +0000 UTC m=+799.682571330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs") pod "speaker-jr4nb" (UID: "031a0646-6ed0-467a-8042-5e0096fb631d") : secret "speaker-certs-secret" not found Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.386586 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/031a0646-6ed0-467a-8042-5e0096fb631d-metallb-excludel2\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.387895 4990 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.400923 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-cert\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 
09:56:57.404161 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t775\" (UniqueName: \"kubernetes.io/projected/d4754a5f-654a-40cd-999f-355a60ad887b-kube-api-access-5t775\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.406627 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtrj9\" (UniqueName: \"kubernetes.io/projected/031a0646-6ed0-467a-8042-5e0096fb631d-kube-api-access-gtrj9\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.428678 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.446754 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qprvb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.755881 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44"] Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.894081 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.894182 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.894221 4990 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.894239 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: E1003 09:56:57.894295 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist podName:031a0646-6ed0-467a-8042-5e0096fb631d nodeName:}" failed. No retries permitted until 2025-10-03 09:56:58.894274604 +0000 UTC m=+800.690906471 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist") pod "speaker-jr4nb" (UID: "031a0646-6ed0-467a-8042-5e0096fb631d") : secret "metallb-memberlist" not found Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.901782 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4754a5f-654a-40cd-999f-355a60ad887b-metrics-certs\") pod \"controller-68d546b9d8-pjbnl\" (UID: \"d4754a5f-654a-40cd-999f-355a60ad887b\") " pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:57 crc kubenswrapper[4990]: I1003 09:56:57.901858 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-metrics-certs\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.172600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"f2108ac1974f4a3a8c3912fb4a1bc889e037d4552ea229ba5db748d81c0ba480"} Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.173290 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.173606 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" event={"ID":"c3e8a922-2054-46e4-b45e-4c525e08f72f","Type":"ContainerStarted","Data":"059b5a475c3d2c2e7b2a15a1047307a01c80847d6b7455188cd62cf3333ed6a6"} Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.627926 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-pjbnl"] Oct 03 09:56:58 crc kubenswrapper[4990]: W1003 09:56:58.632234 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4754a5f_654a_40cd_999f_355a60ad887b.slice/crio-50fcc24bfd84e71fac1e70b06fa0838d467d93e82adea55f5097b1a657ea8ae8 WatchSource:0}: Error finding container 50fcc24bfd84e71fac1e70b06fa0838d467d93e82adea55f5097b1a657ea8ae8: Status 404 returned error can't find the container with id 50fcc24bfd84e71fac1e70b06fa0838d467d93e82adea55f5097b1a657ea8ae8 Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.911950 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:58 crc kubenswrapper[4990]: I1003 09:56:58.940292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/031a0646-6ed0-467a-8042-5e0096fb631d-memberlist\") pod \"speaker-jr4nb\" (UID: \"031a0646-6ed0-467a-8042-5e0096fb631d\") " pod="metallb-system/speaker-jr4nb" Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.045826 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jr4nb" Oct 03 09:56:59 crc kubenswrapper[4990]: W1003 09:56:59.098523 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod031a0646_6ed0_467a_8042_5e0096fb631d.slice/crio-bfdb4d99a2763a7352cb7e27b217345d9f42160f1d5da5fca0e7002d7281ed1d WatchSource:0}: Error finding container bfdb4d99a2763a7352cb7e27b217345d9f42160f1d5da5fca0e7002d7281ed1d: Status 404 returned error can't find the container with id bfdb4d99a2763a7352cb7e27b217345d9f42160f1d5da5fca0e7002d7281ed1d Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.189581 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-pjbnl" event={"ID":"d4754a5f-654a-40cd-999f-355a60ad887b","Type":"ContainerStarted","Data":"12418abc802636b9b156f4c941f2f6cd93085eac61171924638431e063b89342"} Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.189637 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-pjbnl" event={"ID":"d4754a5f-654a-40cd-999f-355a60ad887b","Type":"ContainerStarted","Data":"edc11a397018088b097d03c22f122695177587a158783dfcaab403e1d25638c0"} Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.189647 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-pjbnl" event={"ID":"d4754a5f-654a-40cd-999f-355a60ad887b","Type":"ContainerStarted","Data":"50fcc24bfd84e71fac1e70b06fa0838d467d93e82adea55f5097b1a657ea8ae8"} Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.190660 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:56:59 crc kubenswrapper[4990]: I1003 09:56:59.192104 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jr4nb" 
event={"ID":"031a0646-6ed0-467a-8042-5e0096fb631d","Type":"ContainerStarted","Data":"bfdb4d99a2763a7352cb7e27b217345d9f42160f1d5da5fca0e7002d7281ed1d"} Oct 03 09:57:00 crc kubenswrapper[4990]: I1003 09:57:00.204527 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jr4nb" event={"ID":"031a0646-6ed0-467a-8042-5e0096fb631d","Type":"ContainerStarted","Data":"1b02e8803ff7cadbf1c716becb1ad5e0d820afa2e6bf047bfb5084c5a2d32625"} Oct 03 09:57:00 crc kubenswrapper[4990]: I1003 09:57:00.204892 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jr4nb" event={"ID":"031a0646-6ed0-467a-8042-5e0096fb631d","Type":"ContainerStarted","Data":"5ac92111b6022b314d21d1cdf03bd817381538795a25b4d892e0309f8831ea45"} Oct 03 09:57:00 crc kubenswrapper[4990]: I1003 09:57:00.204916 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jr4nb" Oct 03 09:57:00 crc kubenswrapper[4990]: I1003 09:57:00.229535 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-pjbnl" podStartSLOduration=3.229494012 podStartE2EDuration="3.229494012s" podCreationTimestamp="2025-10-03 09:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:56:59.2051034 +0000 UTC m=+801.001735277" watchObservedRunningTime="2025-10-03 09:57:00.229494012 +0000 UTC m=+802.026125869" Oct 03 09:57:00 crc kubenswrapper[4990]: I1003 09:57:00.233023 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jr4nb" podStartSLOduration=3.233005634 podStartE2EDuration="3.233005634s" podCreationTimestamp="2025-10-03 09:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:57:00.227740457 +0000 UTC m=+802.024372324" 
watchObservedRunningTime="2025-10-03 09:57:00.233005634 +0000 UTC m=+802.029637491" Oct 03 09:57:02 crc kubenswrapper[4990]: I1003 09:57:02.105292 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:02 crc kubenswrapper[4990]: I1003 09:57:02.105759 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:02 crc kubenswrapper[4990]: I1003 09:57:02.152150 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:02 crc kubenswrapper[4990]: I1003 09:57:02.269627 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:02 crc kubenswrapper[4990]: I1003 09:57:02.386767 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:57:04 crc kubenswrapper[4990]: I1003 09:57:04.226577 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7n7ws" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="registry-server" containerID="cri-o://ee8d484ae9227be525582162b70f3fe0a7f5cdf3665d3af0b5711a28f085b9b4" gracePeriod=2 Oct 03 09:57:04 crc kubenswrapper[4990]: I1003 09:57:04.337748 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:04 crc kubenswrapper[4990]: I1003 09:57:04.337814 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:04 crc kubenswrapper[4990]: I1003 09:57:04.386223 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 
09:57:05.233330 4990 generic.go:334] "Generic (PLEG): container finished" podID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerID="ee8d484ae9227be525582162b70f3fe0a7f5cdf3665d3af0b5711a28f085b9b4" exitCode=0 Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.234950 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerDied","Data":"ee8d484ae9227be525582162b70f3fe0a7f5cdf3665d3af0b5711a28f085b9b4"} Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.291860 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.295455 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.403143 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-849q5\" (UniqueName: \"kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5\") pod \"1cee26c5-8852-4b95-b4df-4fb612977d54\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.403233 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content\") pod \"1cee26c5-8852-4b95-b4df-4fb612977d54\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.403383 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities\") pod \"1cee26c5-8852-4b95-b4df-4fb612977d54\" (UID: \"1cee26c5-8852-4b95-b4df-4fb612977d54\") " Oct 03 
09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.404379 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities" (OuterVolumeSpecName: "utilities") pod "1cee26c5-8852-4b95-b4df-4fb612977d54" (UID: "1cee26c5-8852-4b95-b4df-4fb612977d54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.410286 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5" (OuterVolumeSpecName: "kube-api-access-849q5") pod "1cee26c5-8852-4b95-b4df-4fb612977d54" (UID: "1cee26c5-8852-4b95-b4df-4fb612977d54"). InnerVolumeSpecName "kube-api-access-849q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.417907 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cee26c5-8852-4b95-b4df-4fb612977d54" (UID: "1cee26c5-8852-4b95-b4df-4fb612977d54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.505734 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.505788 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-849q5\" (UniqueName: \"kubernetes.io/projected/1cee26c5-8852-4b95-b4df-4fb612977d54-kube-api-access-849q5\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:05 crc kubenswrapper[4990]: I1003 09:57:05.505813 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cee26c5-8852-4b95-b4df-4fb612977d54-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.242249 4990 generic.go:334] "Generic (PLEG): container finished" podID="9bc795a9-e956-4de9-b9e8-98a20a7880af" containerID="7e60f256a27a0014cb9335f53208efefb582abde4202a253a15b3bc6021e03c8" exitCode=0 Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.243321 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerDied","Data":"7e60f256a27a0014cb9335f53208efefb582abde4202a253a15b3bc6021e03c8"} Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.245827 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" event={"ID":"c3e8a922-2054-46e4-b45e-4c525e08f72f","Type":"ContainerStarted","Data":"89ab9721e202fac8c1403d4ffae528ced81ef3f61d66c5bd536f83cd149f0103"} Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.245983 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 
09:57:06.248923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7n7ws" event={"ID":"1cee26c5-8852-4b95-b4df-4fb612977d54","Type":"ContainerDied","Data":"46ac8038b6b40c348cc165a23b76b5d9cfb2aa1eec6aa699e708a15bd17dc279"} Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.249013 4990 scope.go:117] "RemoveContainer" containerID="ee8d484ae9227be525582162b70f3fe0a7f5cdf3665d3af0b5711a28f085b9b4" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.248955 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7n7ws" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.274804 4990 scope.go:117] "RemoveContainer" containerID="2c673cb891335fe42611f9cc8c5b11af3ccfb13f4dc67d070c1c43deb68c9190" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.286982 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" podStartSLOduration=1.9955715330000001 podStartE2EDuration="9.286964651s" podCreationTimestamp="2025-10-03 09:56:57 +0000 UTC" firstStartedPulling="2025-10-03 09:56:57.775389739 +0000 UTC m=+799.572021606" lastFinishedPulling="2025-10-03 09:57:05.066782837 +0000 UTC m=+806.863414724" observedRunningTime="2025-10-03 09:57:06.284670781 +0000 UTC m=+808.081302638" watchObservedRunningTime="2025-10-03 09:57:06.286964651 +0000 UTC m=+808.083596508" Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.306120 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.313014 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7n7ws"] Oct 03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.314762 4990 scope.go:117] "RemoveContainer" containerID="c255bca54619df14a077bb05204ccebcc5bfa409e0f9c0f9ef16dcbebb1c192b" Oct 
03 09:57:06 crc kubenswrapper[4990]: I1003 09:57:06.879778 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" path="/var/lib/kubelet/pods/1cee26c5-8852-4b95-b4df-4fb612977d54/volumes" Oct 03 09:57:07 crc kubenswrapper[4990]: I1003 09:57:07.188023 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:57:07 crc kubenswrapper[4990]: I1003 09:57:07.260897 4990 generic.go:334] "Generic (PLEG): container finished" podID="9bc795a9-e956-4de9-b9e8-98a20a7880af" containerID="b83c94fee5f89a05676fcc4a3dd9cec5e40f3a39d2db06b0a02e3ba93ef1fea3" exitCode=0 Oct 03 09:57:07 crc kubenswrapper[4990]: I1003 09:57:07.260970 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerDied","Data":"b83c94fee5f89a05676fcc4a3dd9cec5e40f3a39d2db06b0a02e3ba93ef1fea3"} Oct 03 09:57:07 crc kubenswrapper[4990]: I1003 09:57:07.262873 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x4fzs" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="registry-server" containerID="cri-o://07e2e19978fe2628ac8a1e7ad64932fbeca11159cb5c38a9ac913a5716a285da" gracePeriod=2 Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.177943 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-pjbnl" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.268969 4990 generic.go:334] "Generic (PLEG): container finished" podID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerID="07e2e19978fe2628ac8a1e7ad64932fbeca11159cb5c38a9ac913a5716a285da" exitCode=0 Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.269024 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" 
event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerDied","Data":"07e2e19978fe2628ac8a1e7ad64932fbeca11159cb5c38a9ac913a5716a285da"} Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.269051 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x4fzs" event={"ID":"f10b3a78-3252-4460-9e4a-e79afeb681d1","Type":"ContainerDied","Data":"59700c9d8458dc4fca5bd8f1407904f8be4dae4d83804477e0e85ab470c526c6"} Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.269066 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59700c9d8458dc4fca5bd8f1407904f8be4dae4d83804477e0e85ab470c526c6" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.271067 4990 generic.go:334] "Generic (PLEG): container finished" podID="9bc795a9-e956-4de9-b9e8-98a20a7880af" containerID="66f7d85cb93391e096431e2c9996f130d1206c49f63cf89c6e8e9e26b8d6e00e" exitCode=0 Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.271092 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerDied","Data":"66f7d85cb93391e096431e2c9996f130d1206c49f63cf89c6e8e9e26b8d6e00e"} Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.316868 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.341304 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content\") pod \"f10b3a78-3252-4460-9e4a-e79afeb681d1\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.341826 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bkdw\" (UniqueName: \"kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw\") pod \"f10b3a78-3252-4460-9e4a-e79afeb681d1\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.341857 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities\") pod \"f10b3a78-3252-4460-9e4a-e79afeb681d1\" (UID: \"f10b3a78-3252-4460-9e4a-e79afeb681d1\") " Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.342857 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities" (OuterVolumeSpecName: "utilities") pod "f10b3a78-3252-4460-9e4a-e79afeb681d1" (UID: "f10b3a78-3252-4460-9e4a-e79afeb681d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.349906 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw" (OuterVolumeSpecName: "kube-api-access-7bkdw") pod "f10b3a78-3252-4460-9e4a-e79afeb681d1" (UID: "f10b3a78-3252-4460-9e4a-e79afeb681d1"). InnerVolumeSpecName "kube-api-access-7bkdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.417496 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f10b3a78-3252-4460-9e4a-e79afeb681d1" (UID: "f10b3a78-3252-4460-9e4a-e79afeb681d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.444064 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bkdw\" (UniqueName: \"kubernetes.io/projected/f10b3a78-3252-4460-9e4a-e79afeb681d1-kube-api-access-7bkdw\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.444137 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:08 crc kubenswrapper[4990]: I1003 09:57:08.444172 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10b3a78-3252-4460-9e4a-e79afeb681d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.052132 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jr4nb" Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282013 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"438675592b6e189097dbc9b6ddc69bf583aea4b5b1d314bb003f39c23a9f823d"} Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282470 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" 
event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"1aa426c1132a0d2c2fe785b92ee0a40250b501032ec5a56858d95cae9c34eb22"} Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"456d722024f2c2a17808084ab7c6eb6ac0dc61c119dbbeaa8b5bed903cace0b8"} Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282500 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"c0b9b89193210c302d60efebd669413adc0a9f51d372c6d12e6c9dcd95547e35"} Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282063 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x4fzs" Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.282523 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"e4b48afb36555f62ca02f1870ed3fa003abaf97e2f4f8573925931da93e4b7a3"} Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.307802 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:57:09 crc kubenswrapper[4990]: I1003 09:57:09.311836 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x4fzs"] Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.294906 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qprvb" event={"ID":"9bc795a9-e956-4de9-b9e8-98a20a7880af","Type":"ContainerStarted","Data":"bfb73f7c60b74e063d1784664234c0407926d9f69ef195ea2babe0334b0bcabf"} Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.295188 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qprvb" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.323573 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qprvb" podStartSLOduration=5.861593004 podStartE2EDuration="13.323549598s" podCreationTimestamp="2025-10-03 09:56:57 +0000 UTC" firstStartedPulling="2025-10-03 09:56:57.636040203 +0000 UTC m=+799.432672060" lastFinishedPulling="2025-10-03 09:57:05.097996797 +0000 UTC m=+806.894628654" observedRunningTime="2025-10-03 09:57:10.320393386 +0000 UTC m=+812.117025273" watchObservedRunningTime="2025-10-03 09:57:10.323549598 +0000 UTC m=+812.120181465" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.835289 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2"] Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.835916 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.835932 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.835942 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.835949 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.835965 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="extract-utilities" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.835973 4990 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="extract-utilities" Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.835991 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="extract-content" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.835999 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="extract-content" Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.836015 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="extract-utilities" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.836024 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="extract-utilities" Oct 03 09:57:10 crc kubenswrapper[4990]: E1003 09:57:10.836033 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="extract-content" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.836041 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="extract-content" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.836178 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.836198 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cee26c5-8852-4b95-b4df-4fb612977d54" containerName="registry-server" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.837048 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.838771 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.848487 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2"] Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.875615 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.875742 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.875841 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rrgl\" (UniqueName: \"kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: 
I1003 09:57:10.881057 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10b3a78-3252-4460-9e4a-e79afeb681d1" path="/var/lib/kubelet/pods/f10b3a78-3252-4460-9e4a-e79afeb681d1/volumes" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.977380 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.977459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rrgl\" (UniqueName: \"kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.977522 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.978023 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:10 crc kubenswrapper[4990]: I1003 09:57:10.978171 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:11 crc kubenswrapper[4990]: I1003 09:57:11.001840 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rrgl\" (UniqueName: \"kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:11 crc kubenswrapper[4990]: I1003 09:57:11.155718 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:11 crc kubenswrapper[4990]: I1003 09:57:11.572103 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2"] Oct 03 09:57:11 crc kubenswrapper[4990]: W1003 09:57:11.577265 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86384ae4_ecae_44dc_9ea7_80a4f19ed2c3.slice/crio-9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260 WatchSource:0}: Error finding container 9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260: Status 404 returned error can't find the container with id 9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260 Oct 03 09:57:12 crc kubenswrapper[4990]: I1003 09:57:12.318730 4990 generic.go:334] "Generic (PLEG): container finished" podID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerID="b53da6a5734ee16844d68f3ce46fbf65b69fbc62b66a1c341df7bc99436e6a5c" exitCode=0 Oct 03 09:57:12 crc kubenswrapper[4990]: I1003 09:57:12.318853 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" event={"ID":"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3","Type":"ContainerDied","Data":"b53da6a5734ee16844d68f3ce46fbf65b69fbc62b66a1c341df7bc99436e6a5c"} Oct 03 09:57:12 crc kubenswrapper[4990]: I1003 09:57:12.319094 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" event={"ID":"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3","Type":"ContainerStarted","Data":"9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260"} Oct 03 09:57:12 crc kubenswrapper[4990]: I1003 09:57:12.447289 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-qprvb" Oct 03 09:57:12 crc kubenswrapper[4990]: I1003 09:57:12.488018 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qprvb" Oct 03 09:57:16 crc kubenswrapper[4990]: I1003 09:57:16.354845 4990 generic.go:334] "Generic (PLEG): container finished" podID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerID="59afdbd18053f8a6efd225a6bcdc22cac2bfe27852f2b5fd7f0203fbdf0bbed4" exitCode=0 Oct 03 09:57:16 crc kubenswrapper[4990]: I1003 09:57:16.354908 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" event={"ID":"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3","Type":"ContainerDied","Data":"59afdbd18053f8a6efd225a6bcdc22cac2bfe27852f2b5fd7f0203fbdf0bbed4"} Oct 03 09:57:17 crc kubenswrapper[4990]: I1003 09:57:17.376183 4990 generic.go:334] "Generic (PLEG): container finished" podID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerID="1473d68bfd92f1f0568c6d45a227e6480cb062650ea01ea147b713c3b94880d9" exitCode=0 Oct 03 09:57:17 crc kubenswrapper[4990]: I1003 09:57:17.376715 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" event={"ID":"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3","Type":"ContainerDied","Data":"1473d68bfd92f1f0568c6d45a227e6480cb062650ea01ea147b713c3b94880d9"} Oct 03 09:57:17 crc kubenswrapper[4990]: I1003 09:57:17.437838 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-m6s44" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.681606 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.787790 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util\") pod \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.787861 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rrgl\" (UniqueName: \"kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl\") pod \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.787890 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle\") pod \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\" (UID: \"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3\") " Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.789146 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle" (OuterVolumeSpecName: "bundle") pod "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" (UID: "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.795338 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl" (OuterVolumeSpecName: "kube-api-access-9rrgl") pod "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" (UID: "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3"). InnerVolumeSpecName "kube-api-access-9rrgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.806852 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util" (OuterVolumeSpecName: "util") pod "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" (UID: "86384ae4-ecae-44dc-9ea7-80a4f19ed2c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.889845 4990 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-util\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.889886 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rrgl\" (UniqueName: \"kubernetes.io/projected/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-kube-api-access-9rrgl\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:18 crc kubenswrapper[4990]: I1003 09:57:18.889903 4990 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86384ae4-ecae-44dc-9ea7-80a4f19ed2c3-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:19 crc kubenswrapper[4990]: I1003 09:57:19.392042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" event={"ID":"86384ae4-ecae-44dc-9ea7-80a4f19ed2c3","Type":"ContainerDied","Data":"9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260"} Oct 03 09:57:19 crc kubenswrapper[4990]: I1003 09:57:19.392488 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9addcea18e98e9ec1741557ab647e99a64692cfa4e71950a5339862e82f5e260" Oct 03 09:57:19 crc kubenswrapper[4990]: I1003 09:57:19.392110 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.281495 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d"] Oct 03 09:57:23 crc kubenswrapper[4990]: E1003 09:57:23.282188 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="extract" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.282202 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="extract" Oct 03 09:57:23 crc kubenswrapper[4990]: E1003 09:57:23.282218 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="pull" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.282224 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="pull" Oct 03 09:57:23 crc kubenswrapper[4990]: E1003 09:57:23.282232 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="util" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.282238 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="util" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.282341 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="86384ae4-ecae-44dc-9ea7-80a4f19ed2c3" containerName="extract" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.282803 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.286555 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.286686 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.287059 4990 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-9mmf7" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.305927 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d"] Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.352017 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgwrs\" (UniqueName: \"kubernetes.io/projected/e3140791-5554-4a14-b166-bac678c4bfdc-kube-api-access-jgwrs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-wkk2d\" (UID: \"e3140791-5554-4a14-b166-bac678c4bfdc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.454150 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgwrs\" (UniqueName: \"kubernetes.io/projected/e3140791-5554-4a14-b166-bac678c4bfdc-kube-api-access-jgwrs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-wkk2d\" (UID: \"e3140791-5554-4a14-b166-bac678c4bfdc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.480863 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jgwrs\" (UniqueName: \"kubernetes.io/projected/e3140791-5554-4a14-b166-bac678c4bfdc-kube-api-access-jgwrs\") pod \"cert-manager-operator-controller-manager-57cd46d6d-wkk2d\" (UID: \"e3140791-5554-4a14-b166-bac678c4bfdc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" Oct 03 09:57:23 crc kubenswrapper[4990]: I1003 09:57:23.607942 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" Oct 03 09:57:24 crc kubenswrapper[4990]: I1003 09:57:24.021208 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d"] Oct 03 09:57:24 crc kubenswrapper[4990]: W1003 09:57:24.026867 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3140791_5554_4a14_b166_bac678c4bfdc.slice/crio-3bd4613efe6a5931e28a21f82ef79502c697f72c0cb499bb34c2c57c4b5a7dc6 WatchSource:0}: Error finding container 3bd4613efe6a5931e28a21f82ef79502c697f72c0cb499bb34c2c57c4b5a7dc6: Status 404 returned error can't find the container with id 3bd4613efe6a5931e28a21f82ef79502c697f72c0cb499bb34c2c57c4b5a7dc6 Oct 03 09:57:24 crc kubenswrapper[4990]: I1003 09:57:24.431539 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" event={"ID":"e3140791-5554-4a14-b166-bac678c4bfdc","Type":"ContainerStarted","Data":"3bd4613efe6a5931e28a21f82ef79502c697f72c0cb499bb34c2c57c4b5a7dc6"} Oct 03 09:57:27 crc kubenswrapper[4990]: I1003 09:57:27.457662 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qprvb" Oct 03 09:57:32 crc kubenswrapper[4990]: I1003 09:57:32.524930 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" event={"ID":"e3140791-5554-4a14-b166-bac678c4bfdc","Type":"ContainerStarted","Data":"e8c1617fae88c1334aa27993c2e13daf70c47426c7a512f05d2170ed5ad50761"} Oct 03 09:57:32 crc kubenswrapper[4990]: I1003 09:57:32.565136 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-wkk2d" podStartSLOduration=2.157960365 podStartE2EDuration="9.565118548s" podCreationTimestamp="2025-10-03 09:57:23 +0000 UTC" firstStartedPulling="2025-10-03 09:57:24.02905899 +0000 UTC m=+825.825690847" lastFinishedPulling="2025-10-03 09:57:31.436217173 +0000 UTC m=+833.232849030" observedRunningTime="2025-10-03 09:57:32.562239574 +0000 UTC m=+834.358871431" watchObservedRunningTime="2025-10-03 09:57:32.565118548 +0000 UTC m=+834.361750395" Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.971032 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-6ddvk"] Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.972389 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.974335 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.974553 4990 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jf2p4" Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.974854 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 09:57:34 crc kubenswrapper[4990]: I1003 09:57:34.987246 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-6ddvk"] Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.117090 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c72b\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-kube-api-access-8c72b\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.117393 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-bound-sa-token\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.219000 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-bound-sa-token\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " 
pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.219074 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c72b\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-kube-api-access-8c72b\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.238918 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-bound-sa-token\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.241527 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c72b\" (UniqueName: \"kubernetes.io/projected/e0a67b45-9f3c-4dac-ab06-c740fa33ee97-kube-api-access-8c72b\") pod \"cert-manager-webhook-d969966f-6ddvk\" (UID: \"e0a67b45-9f3c-4dac-ab06-c740fa33ee97\") " pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.287970 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:35 crc kubenswrapper[4990]: I1003 09:57:35.723124 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-6ddvk"] Oct 03 09:57:35 crc kubenswrapper[4990]: W1003 09:57:35.728336 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a67b45_9f3c_4dac_ab06_c740fa33ee97.slice/crio-32d95c1ad42254d43cb0a1a3f89c630ffbfd15af0f39e45a31abc4faa87a6b43 WatchSource:0}: Error finding container 32d95c1ad42254d43cb0a1a3f89c630ffbfd15af0f39e45a31abc4faa87a6b43: Status 404 returned error can't find the container with id 32d95c1ad42254d43cb0a1a3f89c630ffbfd15af0f39e45a31abc4faa87a6b43 Oct 03 09:57:36 crc kubenswrapper[4990]: I1003 09:57:36.550627 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" event={"ID":"e0a67b45-9f3c-4dac-ab06-c740fa33ee97","Type":"ContainerStarted","Data":"32d95c1ad42254d43cb0a1a3f89c630ffbfd15af0f39e45a31abc4faa87a6b43"} Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.353051 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6"] Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.357346 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.362162 4990 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ft48z" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.365229 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6"] Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.462149 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.462189 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm4gx\" (UniqueName: \"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-kube-api-access-zm4gx\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.562982 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.563033 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm4gx\" (UniqueName: 
\"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-kube-api-access-zm4gx\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.582974 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.583369 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm4gx\" (UniqueName: \"kubernetes.io/projected/a9b8600d-5569-4df7-ac48-0925c3d9c0b9-kube-api-access-zm4gx\") pod \"cert-manager-cainjector-7d9f95dbf-j9hm6\" (UID: \"a9b8600d-5569-4df7-ac48-0925c3d9c0b9\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:38 crc kubenswrapper[4990]: I1003 09:57:38.697898 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.306678 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6"] Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.587971 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" event={"ID":"e0a67b45-9f3c-4dac-ab06-c740fa33ee97","Type":"ContainerStarted","Data":"a158d1447f768a32887728f17f371f96ad9a6f33a34802a4f5c996e51cd581d2"} Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.588111 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.589755 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" event={"ID":"a9b8600d-5569-4df7-ac48-0925c3d9c0b9","Type":"ContainerStarted","Data":"75512aca36fea2c3a21d8ab15f0a79cea5ead2a6dc4164b3005cc355c6da77ed"} Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.589789 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" event={"ID":"a9b8600d-5569-4df7-ac48-0925c3d9c0b9","Type":"ContainerStarted","Data":"c9cdb8a0e5883217b7e945cff1168713e1f0a1b825d9a54f868c35a226e05270"} Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.611707 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" podStartSLOduration=2.3581784949999998 podStartE2EDuration="6.611678892s" podCreationTimestamp="2025-10-03 09:57:34 +0000 UTC" firstStartedPulling="2025-10-03 09:57:35.73099983 +0000 UTC m=+837.527631697" lastFinishedPulling="2025-10-03 09:57:39.984500237 +0000 UTC m=+841.781132094" observedRunningTime="2025-10-03 09:57:40.60506912 +0000 UTC m=+842.401700987" 
watchObservedRunningTime="2025-10-03 09:57:40.611678892 +0000 UTC m=+842.408310759" Oct 03 09:57:40 crc kubenswrapper[4990]: I1003 09:57:40.631744 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-j9hm6" podStartSLOduration=2.631726442 podStartE2EDuration="2.631726442s" podCreationTimestamp="2025-10-03 09:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:57:40.628600451 +0000 UTC m=+842.425232328" watchObservedRunningTime="2025-10-03 09:57:40.631726442 +0000 UTC m=+842.428358299" Oct 03 09:57:45 crc kubenswrapper[4990]: I1003 09:57:45.292222 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-6ddvk" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.531805 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-gp2bd"] Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.535116 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.538202 4990 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8692k" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.558984 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-gp2bd"] Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.590023 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: \"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.590104 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrkp\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-kube-api-access-qxrkp\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: \"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.692183 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: \"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.692250 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrkp\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-kube-api-access-qxrkp\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: 
\"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.728021 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: \"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.728588 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrkp\" (UniqueName: \"kubernetes.io/projected/b6d0cdc1-a2f0-45a9-89d4-81903991fcd2-kube-api-access-qxrkp\") pod \"cert-manager-7d4cc89fcb-gp2bd\" (UID: \"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2\") " pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:54 crc kubenswrapper[4990]: I1003 09:57:54.861181 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" Oct 03 09:57:55 crc kubenswrapper[4990]: I1003 09:57:55.192900 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-gp2bd"] Oct 03 09:57:55 crc kubenswrapper[4990]: W1003 09:57:55.204255 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6d0cdc1_a2f0_45a9_89d4_81903991fcd2.slice/crio-424150bcd160fc7ba1e3dae3a95c2d8a4e496e58147040038f174530ae6aa0f3 WatchSource:0}: Error finding container 424150bcd160fc7ba1e3dae3a95c2d8a4e496e58147040038f174530ae6aa0f3: Status 404 returned error can't find the container with id 424150bcd160fc7ba1e3dae3a95c2d8a4e496e58147040038f174530ae6aa0f3 Oct 03 09:57:55 crc kubenswrapper[4990]: I1003 09:57:55.689570 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" 
event={"ID":"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2","Type":"ContainerStarted","Data":"a6c364e1038452b7111b7009c5a71b198f3f217eb49ab2006162f284ff5c143a"} Oct 03 09:57:55 crc kubenswrapper[4990]: I1003 09:57:55.691104 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" event={"ID":"b6d0cdc1-a2f0-45a9-89d4-81903991fcd2","Type":"ContainerStarted","Data":"424150bcd160fc7ba1e3dae3a95c2d8a4e496e58147040038f174530ae6aa0f3"} Oct 03 09:57:55 crc kubenswrapper[4990]: I1003 09:57:55.705943 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-gp2bd" podStartSLOduration=1.7059040049999998 podStartE2EDuration="1.705904005s" podCreationTimestamp="2025-10-03 09:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:57:55.705336991 +0000 UTC m=+857.501968888" watchObservedRunningTime="2025-10-03 09:57:55.705904005 +0000 UTC m=+857.502535912" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.512351 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"] Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.513806 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s25zf" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.521475 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b825r" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.521905 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.522301 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.528768 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"] Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.556806 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nv27\" (UniqueName: \"kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27\") pod \"openstack-operator-index-s25zf\" (UID: \"f34b1866-2113-448d-8265-2c6577f170f0\") " pod="openstack-operators/openstack-operator-index-s25zf" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.658692 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nv27\" (UniqueName: \"kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27\") pod \"openstack-operator-index-s25zf\" (UID: \"f34b1866-2113-448d-8265-2c6577f170f0\") " pod="openstack-operators/openstack-operator-index-s25zf" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.680834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nv27\" (UniqueName: \"kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27\") pod \"openstack-operator-index-s25zf\" (UID: 
\"f34b1866-2113-448d-8265-2c6577f170f0\") " pod="openstack-operators/openstack-operator-index-s25zf" Oct 03 09:57:58 crc kubenswrapper[4990]: I1003 09:57:58.847833 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s25zf" Oct 03 09:57:59 crc kubenswrapper[4990]: I1003 09:57:59.338068 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"] Oct 03 09:57:59 crc kubenswrapper[4990]: W1003 09:57:59.345003 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34b1866_2113_448d_8265_2c6577f170f0.slice/crio-8b6a3582562d079d355ca62622b15a6c5c011cf9ab1b42da6af220ad038b0357 WatchSource:0}: Error finding container 8b6a3582562d079d355ca62622b15a6c5c011cf9ab1b42da6af220ad038b0357: Status 404 returned error can't find the container with id 8b6a3582562d079d355ca62622b15a6c5c011cf9ab1b42da6af220ad038b0357 Oct 03 09:57:59 crc kubenswrapper[4990]: I1003 09:57:59.723955 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s25zf" event={"ID":"f34b1866-2113-448d-8265-2c6577f170f0","Type":"ContainerStarted","Data":"8b6a3582562d079d355ca62622b15a6c5c011cf9ab1b42da6af220ad038b0357"} Oct 03 09:58:00 crc kubenswrapper[4990]: I1003 09:58:00.735657 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s25zf" event={"ID":"f34b1866-2113-448d-8265-2c6577f170f0","Type":"ContainerStarted","Data":"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"} Oct 03 09:58:00 crc kubenswrapper[4990]: I1003 09:58:00.759325 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s25zf" podStartSLOduration=1.6825078279999999 podStartE2EDuration="2.759295121s" podCreationTimestamp="2025-10-03 09:57:58 +0000 UTC" 
firstStartedPulling="2025-10-03 09:57:59.347294677 +0000 UTC m=+861.143926534" lastFinishedPulling="2025-10-03 09:58:00.42408193 +0000 UTC m=+862.220713827" observedRunningTime="2025-10-03 09:58:00.758754936 +0000 UTC m=+862.555386813" watchObservedRunningTime="2025-10-03 09:58:00.759295121 +0000 UTC m=+862.555926998"
Oct 03 09:58:01 crc kubenswrapper[4990]: I1003 09:58:01.891627 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"]
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.703280 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rdszf"]
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.704679 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.714945 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rdszf"]
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.750373 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-s25zf" podUID="f34b1866-2113-448d-8265-2c6577f170f0" containerName="registry-server" containerID="cri-o://77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854" gracePeriod=2
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.821182 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrjm\" (UniqueName: \"kubernetes.io/projected/022e23c3-ee2b-46c0-a326-7cc338bd845e-kube-api-access-wvrjm\") pod \"openstack-operator-index-rdszf\" (UID: \"022e23c3-ee2b-46c0-a326-7cc338bd845e\") " pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.922581 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrjm\" (UniqueName: \"kubernetes.io/projected/022e23c3-ee2b-46c0-a326-7cc338bd845e-kube-api-access-wvrjm\") pod \"openstack-operator-index-rdszf\" (UID: \"022e23c3-ee2b-46c0-a326-7cc338bd845e\") " pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:02 crc kubenswrapper[4990]: I1003 09:58:02.949176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrjm\" (UniqueName: \"kubernetes.io/projected/022e23c3-ee2b-46c0-a326-7cc338bd845e-kube-api-access-wvrjm\") pod \"openstack-operator-index-rdszf\" (UID: \"022e23c3-ee2b-46c0-a326-7cc338bd845e\") " pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.023411 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.180826 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s25zf"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.226660 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nv27\" (UniqueName: \"kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27\") pod \"f34b1866-2113-448d-8265-2c6577f170f0\" (UID: \"f34b1866-2113-448d-8265-2c6577f170f0\") "
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.233108 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27" (OuterVolumeSpecName: "kube-api-access-6nv27") pod "f34b1866-2113-448d-8265-2c6577f170f0" (UID: "f34b1866-2113-448d-8265-2c6577f170f0"). InnerVolumeSpecName "kube-api-access-6nv27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.328247 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nv27\" (UniqueName: \"kubernetes.io/projected/f34b1866-2113-448d-8265-2c6577f170f0-kube-api-access-6nv27\") on node \"crc\" DevicePath \"\""
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.446164 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rdszf"]
Oct 03 09:58:03 crc kubenswrapper[4990]: W1003 09:58:03.459306 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022e23c3_ee2b_46c0_a326_7cc338bd845e.slice/crio-af057708bc1c08784710db32bc06502423f1a6e3411163d421140b7b27ab5422 WatchSource:0}: Error finding container af057708bc1c08784710db32bc06502423f1a6e3411163d421140b7b27ab5422: Status 404 returned error can't find the container with id af057708bc1c08784710db32bc06502423f1a6e3411163d421140b7b27ab5422
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.763161 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rdszf" event={"ID":"022e23c3-ee2b-46c0-a326-7cc338bd845e","Type":"ContainerStarted","Data":"af057708bc1c08784710db32bc06502423f1a6e3411163d421140b7b27ab5422"}
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.766329 4990 generic.go:334] "Generic (PLEG): container finished" podID="f34b1866-2113-448d-8265-2c6577f170f0" containerID="77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854" exitCode=0
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.766394 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s25zf" event={"ID":"f34b1866-2113-448d-8265-2c6577f170f0","Type":"ContainerDied","Data":"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"}
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.766448 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s25zf" event={"ID":"f34b1866-2113-448d-8265-2c6577f170f0","Type":"ContainerDied","Data":"8b6a3582562d079d355ca62622b15a6c5c011cf9ab1b42da6af220ad038b0357"}
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.766452 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s25zf"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.766474 4990 scope.go:117] "RemoveContainer" containerID="77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.795617 4990 scope.go:117] "RemoveContainer" containerID="77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"
Oct 03 09:58:03 crc kubenswrapper[4990]: E1003 09:58:03.796497 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854\": container with ID starting with 77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854 not found: ID does not exist" containerID="77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.796725 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854"} err="failed to get container status \"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854\": rpc error: code = NotFound desc = could not find container \"77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854\": container with ID starting with 77e1a7ae18c33f7c343ab6b5d4067dedebae12a872623e89e272ede4a8b27854 not found: ID does not exist"
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.816025 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"]
Oct 03 09:58:03 crc kubenswrapper[4990]: I1003 09:58:03.821150 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-s25zf"]
Oct 03 09:58:04 crc kubenswrapper[4990]: I1003 09:58:04.774828 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rdszf" event={"ID":"022e23c3-ee2b-46c0-a326-7cc338bd845e","Type":"ContainerStarted","Data":"69de1d7b06318b22168fa340eccb8e1e94b04f8ddac49504255a4724c6716833"}
Oct 03 09:58:04 crc kubenswrapper[4990]: I1003 09:58:04.800681 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rdszf" podStartSLOduration=2.3816295 podStartE2EDuration="2.80065319s" podCreationTimestamp="2025-10-03 09:58:02 +0000 UTC" firstStartedPulling="2025-10-03 09:58:03.472748894 +0000 UTC m=+865.269380761" lastFinishedPulling="2025-10-03 09:58:03.891772594 +0000 UTC m=+865.688404451" observedRunningTime="2025-10-03 09:58:04.795201218 +0000 UTC m=+866.591833095" watchObservedRunningTime="2025-10-03 09:58:04.80065319 +0000 UTC m=+866.597285087"
Oct 03 09:58:04 crc kubenswrapper[4990]: I1003 09:58:04.882257 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34b1866-2113-448d-8265-2c6577f170f0" path="/var/lib/kubelet/pods/f34b1866-2113-448d-8265-2c6577f170f0/volumes"
Oct 03 09:58:13 crc kubenswrapper[4990]: I1003 09:58:13.024030 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:13 crc kubenswrapper[4990]: I1003 09:58:13.024675 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:13 crc kubenswrapper[4990]: I1003 09:58:13.054218 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:13 crc kubenswrapper[4990]: I1003 09:58:13.872360 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rdszf"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.337404 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"]
Oct 03 09:58:15 crc kubenswrapper[4990]: E1003 09:58:15.339496 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34b1866-2113-448d-8265-2c6577f170f0" containerName="registry-server"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.339626 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34b1866-2113-448d-8265-2c6577f170f0" containerName="registry-server"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.339906 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34b1866-2113-448d-8265-2c6577f170f0" containerName="registry-server"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.341144 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.344041 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6dkrn"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.351945 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"]
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.505430 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.505915 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h8kd\" (UniqueName: \"kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.506155 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.607552 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.607643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.607676 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h8kd\" (UniqueName: \"kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.608430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.608655 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.627042 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h8kd\" (UniqueName: \"kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.665186 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:15 crc kubenswrapper[4990]: I1003 09:58:15.917659 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"]
Oct 03 09:58:16 crc kubenswrapper[4990]: I1003 09:58:16.877159 4990 generic.go:334] "Generic (PLEG): container finished" podID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerID="609796f7d03e27ed35639ed1634fb2650d11a85dce6abbb427742a2a95b45558" exitCode=0
Oct 03 09:58:16 crc kubenswrapper[4990]: I1003 09:58:16.885575 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl" event={"ID":"51a737d8-6e16-425c-9dfc-bdcc5e8d3608","Type":"ContainerDied","Data":"609796f7d03e27ed35639ed1634fb2650d11a85dce6abbb427742a2a95b45558"}
Oct 03 09:58:16 crc kubenswrapper[4990]: I1003 09:58:16.885657 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl" event={"ID":"51a737d8-6e16-425c-9dfc-bdcc5e8d3608","Type":"ContainerStarted","Data":"4893966ecf5eb9a2b44aa5dec0b9f85db61d28feeed0c457a936b437bbcde14a"}
Oct 03 09:58:18 crc kubenswrapper[4990]: I1003 09:58:18.899044 4990 generic.go:334] "Generic (PLEG): container finished" podID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerID="0a27374a029ff6b919cc356e118ce919198f1ae44ebbd3054200d057327769ab" exitCode=0
Oct 03 09:58:18 crc kubenswrapper[4990]: I1003 09:58:18.899275 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl" event={"ID":"51a737d8-6e16-425c-9dfc-bdcc5e8d3608","Type":"ContainerDied","Data":"0a27374a029ff6b919cc356e118ce919198f1ae44ebbd3054200d057327769ab"}
Oct 03 09:58:19 crc kubenswrapper[4990]: I1003 09:58:19.918494 4990 generic.go:334] "Generic (PLEG): container finished" podID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerID="a47e3120caa49ea0ee1f396d161118c663a13865e90730db85f6e5c9a075beb7" exitCode=0
Oct 03 09:58:19 crc kubenswrapper[4990]: I1003 09:58:19.918720 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl" event={"ID":"51a737d8-6e16-425c-9dfc-bdcc5e8d3608","Type":"ContainerDied","Data":"a47e3120caa49ea0ee1f396d161118c663a13865e90730db85f6e5c9a075beb7"}
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.205417 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.404275 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h8kd\" (UniqueName: \"kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd\") pod \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") "
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.404417 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util\") pod \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") "
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.404839 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle\") pod \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\" (UID: \"51a737d8-6e16-425c-9dfc-bdcc5e8d3608\") "
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.406460 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle" (OuterVolumeSpecName: "bundle") pod "51a737d8-6e16-425c-9dfc-bdcc5e8d3608" (UID: "51a737d8-6e16-425c-9dfc-bdcc5e8d3608"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.414727 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd" (OuterVolumeSpecName: "kube-api-access-6h8kd") pod "51a737d8-6e16-425c-9dfc-bdcc5e8d3608" (UID: "51a737d8-6e16-425c-9dfc-bdcc5e8d3608"). InnerVolumeSpecName "kube-api-access-6h8kd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.418287 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util" (OuterVolumeSpecName: "util") pod "51a737d8-6e16-425c-9dfc-bdcc5e8d3608" (UID: "51a737d8-6e16-425c-9dfc-bdcc5e8d3608"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.506593 4990 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.506637 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h8kd\" (UniqueName: \"kubernetes.io/projected/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-kube-api-access-6h8kd\") on node \"crc\" DevicePath \"\""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.506650 4990 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51a737d8-6e16-425c-9dfc-bdcc5e8d3608-util\") on node \"crc\" DevicePath \"\""
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.943735 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl" event={"ID":"51a737d8-6e16-425c-9dfc-bdcc5e8d3608","Type":"ContainerDied","Data":"4893966ecf5eb9a2b44aa5dec0b9f85db61d28feeed0c457a936b437bbcde14a"}
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.943808 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4893966ecf5eb9a2b44aa5dec0b9f85db61d28feeed0c457a936b437bbcde14a"
Oct 03 09:58:21 crc kubenswrapper[4990]: I1003 09:58:21.943844 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl"
Oct 03 09:58:25 crc kubenswrapper[4990]: I1003 09:58:25.304034 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 09:58:25 crc kubenswrapper[4990]: I1003 09:58:25.304138 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.531815 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"]
Oct 03 09:58:27 crc kubenswrapper[4990]: E1003 09:58:27.532554 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="util"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.532571 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="util"
Oct 03 09:58:27 crc kubenswrapper[4990]: E1003 09:58:27.532591 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="pull"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.532599 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="pull"
Oct 03 09:58:27 crc kubenswrapper[4990]: E1003 09:58:27.532616 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="extract"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.532626 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="extract"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.532759 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a737d8-6e16-425c-9dfc-bdcc5e8d3608" containerName="extract"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.533451 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.535178 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-d7q7x"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.569309 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"]
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.711339 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgb8\" (UniqueName: \"kubernetes.io/projected/d55415e3-b106-4fb0-8df5-c8a268332331-kube-api-access-7fgb8\") pod \"openstack-operator-controller-operator-86f8d7b75f-zmfd7\" (UID: \"d55415e3-b106-4fb0-8df5-c8a268332331\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.813243 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgb8\" (UniqueName: \"kubernetes.io/projected/d55415e3-b106-4fb0-8df5-c8a268332331-kube-api-access-7fgb8\") pod \"openstack-operator-controller-operator-86f8d7b75f-zmfd7\" (UID: \"d55415e3-b106-4fb0-8df5-c8a268332331\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.835390 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgb8\" (UniqueName: \"kubernetes.io/projected/d55415e3-b106-4fb0-8df5-c8a268332331-kube-api-access-7fgb8\") pod \"openstack-operator-controller-operator-86f8d7b75f-zmfd7\" (UID: \"d55415e3-b106-4fb0-8df5-c8a268332331\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:27 crc kubenswrapper[4990]: I1003 09:58:27.853766 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:28 crc kubenswrapper[4990]: I1003 09:58:28.139914 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"]
Oct 03 09:58:29 crc kubenswrapper[4990]: I1003 09:58:29.007834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7" event={"ID":"d55415e3-b106-4fb0-8df5-c8a268332331","Type":"ContainerStarted","Data":"13e0b3bdff84f8ff0c8612ce22859827b5bfde06e78e4864c28eab8951ab3db4"}
Oct 03 09:58:33 crc kubenswrapper[4990]: I1003 09:58:33.038056 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7" event={"ID":"d55415e3-b106-4fb0-8df5-c8a268332331","Type":"ContainerStarted","Data":"6f1ba0f88affcac5682cf44f66d6c578b0c9189d1dd196337ef60f9e3e3a0a72"}
Oct 03 09:58:36 crc kubenswrapper[4990]: I1003 09:58:36.059889 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7" event={"ID":"d55415e3-b106-4fb0-8df5-c8a268332331","Type":"ContainerStarted","Data":"5150bc1df0e9934a9d1fd634541fbded8a2d1513033827a50acd08a73602da1a"}
Oct 03 09:58:36 crc kubenswrapper[4990]: I1003 09:58:36.060348 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:36 crc kubenswrapper[4990]: I1003 09:58:36.094376 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7" podStartSLOduration=2.13401221 podStartE2EDuration="9.094357044s" podCreationTimestamp="2025-10-03 09:58:27 +0000 UTC" firstStartedPulling="2025-10-03 09:58:28.150487682 +0000 UTC m=+889.947119549" lastFinishedPulling="2025-10-03 09:58:35.110832526 +0000 UTC m=+896.907464383" observedRunningTime="2025-10-03 09:58:36.091153081 +0000 UTC m=+897.887784948" watchObservedRunningTime="2025-10-03 09:58:36.094357044 +0000 UTC m=+897.890988901"
Oct 03 09:58:37 crc kubenswrapper[4990]: I1003 09:58:37.070217 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-zmfd7"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.915876 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.917633 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.922893 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xmp9s"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.926802 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.941050 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.941236 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.941995 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.946380 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xxgql"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.981868 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.983373 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.988895 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dm6tm"
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.990162 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm"]
Oct 03 09:58:52 crc kubenswrapper[4990]: I1003 09:58:52.996724 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.000647 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.001019 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h7bvr"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.007393 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.066442 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.069252 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.077569 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-9k7k2"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.081098 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.084457 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.085841 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.088656 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wrrzp"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.090470 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.095056 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.097008 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.099862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnxj\" (UniqueName: \"kubernetes.io/projected/2e80486e-af20-49c4-9aba-fc7a312ed0b6-kube-api-access-mrnxj\") pod \"cinder-operator-controller-manager-8686fd99f7-vhpqb\" (UID: \"2e80486e-af20-49c4-9aba-fc7a312ed0b6\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100304 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100412 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pnxt\" (UniqueName: \"kubernetes.io/projected/7c0910c3-4982-4e25-9c19-f65aef6c1dc8-kube-api-access-2pnxt\") pod \"designate-operator-controller-manager-58d86cd59d-nls5l\" (UID: \"7c0910c3-4982-4e25-9c19-f65aef6c1dc8\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100437 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100456 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4tb\" (UniqueName: \"kubernetes.io/projected/466406f5-5f91-433d-859e-966713b4d752-kube-api-access-pk4tb\") pod \"glance-operator-controller-manager-d785ddfd5-h88gm\" (UID: \"466406f5-5f91-433d-859e-966713b4d752\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100573 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csxk\" (UniqueName: \"kubernetes.io/projected/638beb05-4965-4934-90de-4b06ff173650-kube-api-access-7csxk\") pod \"barbican-operator-controller-manager-6d6d64fdcf-t2d2q\" (UID: \"638beb05-4965-4934-90de-4b06ff173650\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.100393 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pcf8n"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.160258 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.161765 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.164212 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-856gc"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.185189 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.190436 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.191852 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.194649 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fkt7k"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.195338 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd"]
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.196833 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.198259 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4p6cs"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202022 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb978\" (UniqueName: \"kubernetes.io/projected/a6826d26-170b-42c6-a519-599c8873c53a-kube-api-access-qb978\") pod \"horizon-operator-controller-manager-586b66cf4f-llmqm\" (UID: \"a6826d26-170b-42c6-a519-599c8873c53a\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202077 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pnxt\" (UniqueName: \"kubernetes.io/projected/7c0910c3-4982-4e25-9c19-f65aef6c1dc8-kube-api-access-2pnxt\") pod \"designate-operator-controller-manager-58d86cd59d-nls5l\" (UID: \"7c0910c3-4982-4e25-9c19-f65aef6c1dc8\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"
Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202117 4990 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-pk4tb\" (UniqueName: \"kubernetes.io/projected/466406f5-5f91-433d-859e-966713b4d752-kube-api-access-pk4tb\") pod \"glance-operator-controller-manager-d785ddfd5-h88gm\" (UID: \"466406f5-5f91-433d-859e-966713b4d752\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202147 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csxk\" (UniqueName: \"kubernetes.io/projected/638beb05-4965-4934-90de-4b06ff173650-kube-api-access-7csxk\") pod \"barbican-operator-controller-manager-6d6d64fdcf-t2d2q\" (UID: \"638beb05-4965-4934-90de-4b06ff173650\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202204 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202236 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzzc\" (UniqueName: \"kubernetes.io/projected/d97d66ac-8e4e-414b-bad7-1f4323ee7ac5-kube-api-access-tdzzc\") pod \"keystone-operator-controller-manager-6c9969c6c6-hwbjj\" (UID: \"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202273 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnxj\" (UniqueName: \"kubernetes.io/projected/2e80486e-af20-49c4-9aba-fc7a312ed0b6-kube-api-access-mrnxj\") pod 
\"cinder-operator-controller-manager-8686fd99f7-vhpqb\" (UID: \"2e80486e-af20-49c4-9aba-fc7a312ed0b6\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202295 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkh2t\" (UniqueName: \"kubernetes.io/projected/cac8d3c8-ac83-4af9-a285-4020d5978c74-kube-api-access-rkh2t\") pod \"ironic-operator-controller-manager-59b5fc9845-8vxtx\" (UID: \"cac8d3c8-ac83-4af9-a285-4020d5978c74\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202317 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxdk\" (UniqueName: \"kubernetes.io/projected/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-kube-api-access-8cxdk\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202338 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49zl\" (UniqueName: \"kubernetes.io/projected/d863cc9f-06fe-4045-bb72-f2d3c36d14f3-kube-api-access-b49zl\") pod \"heat-operator-controller-manager-5ffbdb7ddf-fcvl4\" (UID: \"d863cc9f-06fe-4045-bb72-f2d3c36d14f3\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.202361 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cf89\" (UniqueName: \"kubernetes.io/projected/28afe97f-598e-4e6f-95b5-3e72e3082504-kube-api-access-5cf89\") pod \"manila-operator-controller-manager-66fdd975d9-nc6dd\" (UID: 
\"28afe97f-598e-4e6f-95b5-3e72e3082504\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.233785 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnxj\" (UniqueName: \"kubernetes.io/projected/2e80486e-af20-49c4-9aba-fc7a312ed0b6-kube-api-access-mrnxj\") pod \"cinder-operator-controller-manager-8686fd99f7-vhpqb\" (UID: \"2e80486e-af20-49c4-9aba-fc7a312ed0b6\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.258257 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csxk\" (UniqueName: \"kubernetes.io/projected/638beb05-4965-4934-90de-4b06ff173650-kube-api-access-7csxk\") pod \"barbican-operator-controller-manager-6d6d64fdcf-t2d2q\" (UID: \"638beb05-4965-4934-90de-4b06ff173650\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.274303 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.276615 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.279912 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.281111 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.292575 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pnxt\" (UniqueName: \"kubernetes.io/projected/7c0910c3-4982-4e25-9c19-f65aef6c1dc8-kube-api-access-2pnxt\") pod \"designate-operator-controller-manager-58d86cd59d-nls5l\" (UID: \"7c0910c3-4982-4e25-9c19-f65aef6c1dc8\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.296043 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.297064 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4tb\" (UniqueName: \"kubernetes.io/projected/466406f5-5f91-433d-859e-966713b4d752-kube-api-access-pk4tb\") pod \"glance-operator-controller-manager-d785ddfd5-h88gm\" (UID: \"466406f5-5f91-433d-859e-966713b4d752\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.297609 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.305767 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.306616 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f2pkn" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.307431 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzzc\" (UniqueName: \"kubernetes.io/projected/d97d66ac-8e4e-414b-bad7-1f4323ee7ac5-kube-api-access-tdzzc\") pod \"keystone-operator-controller-manager-6c9969c6c6-hwbjj\" (UID: \"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.307478 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkh2t\" (UniqueName: \"kubernetes.io/projected/cac8d3c8-ac83-4af9-a285-4020d5978c74-kube-api-access-rkh2t\") pod \"ironic-operator-controller-manager-59b5fc9845-8vxtx\" (UID: \"cac8d3c8-ac83-4af9-a285-4020d5978c74\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.307499 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxdk\" (UniqueName: \"kubernetes.io/projected/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-kube-api-access-8cxdk\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.308268 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49zl\" (UniqueName: \"kubernetes.io/projected/d863cc9f-06fe-4045-bb72-f2d3c36d14f3-kube-api-access-b49zl\") pod 
\"heat-operator-controller-manager-5ffbdb7ddf-fcvl4\" (UID: \"d863cc9f-06fe-4045-bb72-f2d3c36d14f3\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.308315 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cf89\" (UniqueName: \"kubernetes.io/projected/28afe97f-598e-4e6f-95b5-3e72e3082504-kube-api-access-5cf89\") pod \"manila-operator-controller-manager-66fdd975d9-nc6dd\" (UID: \"28afe97f-598e-4e6f-95b5-3e72e3082504\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.308346 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb978\" (UniqueName: \"kubernetes.io/projected/a6826d26-170b-42c6-a519-599c8873c53a-kube-api-access-qb978\") pod \"horizon-operator-controller-manager-586b66cf4f-llmqm\" (UID: \"a6826d26-170b-42c6-a519-599c8873c53a\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.308406 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.308544 4990 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.308602 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert podName:0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e nodeName:}" failed. 
No retries permitted until 2025-10-03 09:58:53.808579757 +0000 UTC m=+915.605211614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert") pod "infra-operator-controller-manager-7c9978f67-vhtvr" (UID: "0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e") : secret "infra-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.308734 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-298td" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.309083 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.334632 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.367796 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkh2t\" (UniqueName: \"kubernetes.io/projected/cac8d3c8-ac83-4af9-a285-4020d5978c74-kube-api-access-rkh2t\") pod \"ironic-operator-controller-manager-59b5fc9845-8vxtx\" (UID: \"cac8d3c8-ac83-4af9-a285-4020d5978c74\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.371796 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.380744 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49zl\" (UniqueName: \"kubernetes.io/projected/d863cc9f-06fe-4045-bb72-f2d3c36d14f3-kube-api-access-b49zl\") pod \"heat-operator-controller-manager-5ffbdb7ddf-fcvl4\" (UID: 
\"d863cc9f-06fe-4045-bb72-f2d3c36d14f3\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.380847 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxdk\" (UniqueName: \"kubernetes.io/projected/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-kube-api-access-8cxdk\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.384490 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb978\" (UniqueName: \"kubernetes.io/projected/a6826d26-170b-42c6-a519-599c8873c53a-kube-api-access-qb978\") pod \"horizon-operator-controller-manager-586b66cf4f-llmqm\" (UID: \"a6826d26-170b-42c6-a519-599c8873c53a\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.410064 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzzc\" (UniqueName: \"kubernetes.io/projected/d97d66ac-8e4e-414b-bad7-1f4323ee7ac5-kube-api-access-tdzzc\") pod \"keystone-operator-controller-manager-6c9969c6c6-hwbjj\" (UID: \"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.412750 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cf89\" (UniqueName: \"kubernetes.io/projected/28afe97f-598e-4e6f-95b5-3e72e3082504-kube-api-access-5cf89\") pod \"manila-operator-controller-manager-66fdd975d9-nc6dd\" (UID: \"28afe97f-598e-4e6f-95b5-3e72e3082504\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.422164 4990 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.477713 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.479063 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.479641 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.480846 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rgf\" (UniqueName: \"kubernetes.io/projected/8c8ec33a-ded3-49b2-9f80-41f52b2d2833-kube-api-access-z6rgf\") pod \"mariadb-operator-controller-manager-696ff4bcdd-m6x7p\" (UID: \"8c8ec33a-ded3-49b2-9f80-41f52b2d2833\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.481132 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmq2\" (UniqueName: \"kubernetes.io/projected/aec15413-8ac4-4d63-a4d3-f28754800047-kube-api-access-7lmq2\") pod \"neutron-operator-controller-manager-549fb68678-d2sm6\" (UID: \"aec15413-8ac4-4d63-a4d3-f28754800047\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.480935 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.519422 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.538076 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.540609 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.542088 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.556068 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gfcns" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.569629 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.582290 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxh8n\" (UniqueName: \"kubernetes.io/projected/568468c9-631c-4cb7-bdcb-a6a6696c23f7-kube-api-access-qxh8n\") pod \"nova-operator-controller-manager-5b45478b88-r7r9l\" (UID: \"568468c9-631c-4cb7-bdcb-a6a6696c23f7\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.591447 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmq2\" 
(UniqueName: \"kubernetes.io/projected/aec15413-8ac4-4d63-a4d3-f28754800047-kube-api-access-7lmq2\") pod \"neutron-operator-controller-manager-549fb68678-d2sm6\" (UID: \"aec15413-8ac4-4d63-a4d3-f28754800047\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.592462 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rgf\" (UniqueName: \"kubernetes.io/projected/8c8ec33a-ded3-49b2-9f80-41f52b2d2833-kube-api-access-z6rgf\") pod \"mariadb-operator-controller-manager-696ff4bcdd-m6x7p\" (UID: \"8c8ec33a-ded3-49b2-9f80-41f52b2d2833\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.591992 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.593599 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.598016 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.611503 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.612656 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.611762 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.618420 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.622206 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wk4wb" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.679472 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmq2\" (UniqueName: \"kubernetes.io/projected/aec15413-8ac4-4d63-a4d3-f28754800047-kube-api-access-7lmq2\") pod \"neutron-operator-controller-manager-549fb68678-d2sm6\" (UID: \"aec15413-8ac4-4d63-a4d3-f28754800047\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.703394 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.703896 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.704495 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.704616 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rgf\" (UniqueName: \"kubernetes.io/projected/8c8ec33a-ded3-49b2-9f80-41f52b2d2833-kube-api-access-z6rgf\") pod \"mariadb-operator-controller-manager-696ff4bcdd-m6x7p\" (UID: \"8c8ec33a-ded3-49b2-9f80-41f52b2d2833\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.722257 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxh8n\" (UniqueName: \"kubernetes.io/projected/568468c9-631c-4cb7-bdcb-a6a6696c23f7-kube-api-access-qxh8n\") pod \"nova-operator-controller-manager-5b45478b88-r7r9l\" (UID: \"568468c9-631c-4cb7-bdcb-a6a6696c23f7\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.722321 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8tnr\" (UniqueName: \"kubernetes.io/projected/66d8cf5e-b670-4023-8e90-de7accf71f47-kube-api-access-s8tnr\") pod \"placement-operator-controller-manager-ccbfcb8c-8dtdl\" (UID: \"66d8cf5e-b670-4023-8e90-de7accf71f47\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.740315 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-l4mqc" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.740712 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-d95rq" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.740881 4990 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vk57d" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.751426 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.752360 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.757967 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.758767 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pz4fl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.777993 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.794640 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.810299 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.810384 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxh8n\" (UniqueName: \"kubernetes.io/projected/568468c9-631c-4cb7-bdcb-a6a6696c23f7-kube-api-access-qxh8n\") pod \"nova-operator-controller-manager-5b45478b88-r7r9l\" (UID: \"568468c9-631c-4cb7-bdcb-a6a6696c23f7\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825259 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4jh\" (UniqueName: \"kubernetes.io/projected/6683b324-a42c-419b-85a6-35386e0016bb-kube-api-access-9t4jh\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825303 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzpnh\" (UniqueName: \"kubernetes.io/projected/e76b0ac3-f0b9-4a88-b495-c6430a9568fe-kube-api-access-bzpnh\") pod \"octavia-operator-controller-manager-b4444585c-rhx7t\" (UID: \"e76b0ac3-f0b9-4a88-b495-c6430a9568fe\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825347 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8tq8\" (UniqueName: \"kubernetes.io/projected/2143aa33-b534-4237-b15b-e63c30a4672d-kube-api-access-l8tq8\") pod \"ovn-operator-controller-manager-855d7949fc-qljt8\" (UID: \"2143aa33-b534-4237-b15b-e63c30a4672d\") " 
pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825373 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvz4c\" (UniqueName: \"kubernetes.io/projected/5e6f315d-894b-4bcc-89fa-4cfa08e9cf88-kube-api-access-bvz4c\") pod \"swift-operator-controller-manager-76d5577b-2j7gl\" (UID: \"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825391 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825429 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8tnr\" (UniqueName: \"kubernetes.io/projected/66d8cf5e-b670-4023-8e90-de7accf71f47-kube-api-access-s8tnr\") pod \"placement-operator-controller-manager-ccbfcb8c-8dtdl\" (UID: \"66d8cf5e-b670-4023-8e90-de7accf71f47\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.825455 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.825659 4990 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.825704 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert podName:0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e nodeName:}" failed. No retries permitted until 2025-10-03 09:58:54.825688107 +0000 UTC m=+916.622319964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert") pod "infra-operator-controller-manager-7c9978f67-vhtvr" (UID: "0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e") : secret "infra-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.852128 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.856267 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.874602 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.876061 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.901268 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6r8kz" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.926420 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4jh\" (UniqueName: \"kubernetes.io/projected/6683b324-a42c-419b-85a6-35386e0016bb-kube-api-access-9t4jh\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.926469 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzpnh\" (UniqueName: \"kubernetes.io/projected/e76b0ac3-f0b9-4a88-b495-c6430a9568fe-kube-api-access-bzpnh\") pod \"octavia-operator-controller-manager-b4444585c-rhx7t\" (UID: \"e76b0ac3-f0b9-4a88-b495-c6430a9568fe\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.926537 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8tq8\" (UniqueName: \"kubernetes.io/projected/2143aa33-b534-4237-b15b-e63c30a4672d-kube-api-access-l8tq8\") pod \"ovn-operator-controller-manager-855d7949fc-qljt8\" (UID: \"2143aa33-b534-4237-b15b-e63c30a4672d\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.926570 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvz4c\" (UniqueName: \"kubernetes.io/projected/5e6f315d-894b-4bcc-89fa-4cfa08e9cf88-kube-api-access-bvz4c\") pod 
\"swift-operator-controller-manager-76d5577b-2j7gl\" (UID: \"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.926591 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.926764 4990 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: E1003 09:58:53.926811 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert podName:6683b324-a42c-419b-85a6-35386e0016bb nodeName:}" failed. No retries permitted until 2025-10-03 09:58:54.426795535 +0000 UTC m=+916.223427392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert") pod "openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" (UID: "6683b324-a42c-419b-85a6-35386e0016bb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.931745 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl"] Oct 03 09:58:53 crc kubenswrapper[4990]: I1003 09:58:53.962407 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:53.995573 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8tnr\" (UniqueName: \"kubernetes.io/projected/66d8cf5e-b670-4023-8e90-de7accf71f47-kube-api-access-s8tnr\") pod \"placement-operator-controller-manager-ccbfcb8c-8dtdl\" (UID: \"66d8cf5e-b670-4023-8e90-de7accf71f47\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:53.998600 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8tq8\" (UniqueName: \"kubernetes.io/projected/2143aa33-b534-4237-b15b-e63c30a4672d-kube-api-access-l8tq8\") pod \"ovn-operator-controller-manager-855d7949fc-qljt8\" (UID: \"2143aa33-b534-4237-b15b-e63c30a4672d\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.002100 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzpnh\" (UniqueName: \"kubernetes.io/projected/e76b0ac3-f0b9-4a88-b495-c6430a9568fe-kube-api-access-bzpnh\") pod \"octavia-operator-controller-manager-b4444585c-rhx7t\" (UID: \"e76b0ac3-f0b9-4a88-b495-c6430a9568fe\") " 
pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.005236 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4jh\" (UniqueName: \"kubernetes.io/projected/6683b324-a42c-419b-85a6-35386e0016bb-kube-api-access-9t4jh\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.010267 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvz4c\" (UniqueName: \"kubernetes.io/projected/5e6f315d-894b-4bcc-89fa-4cfa08e9cf88-kube-api-access-bvz4c\") pod \"swift-operator-controller-manager-76d5577b-2j7gl\" (UID: \"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.017812 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.028201 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8b2t\" (UniqueName: \"kubernetes.io/projected/8596d634-51fe-4c96-a347-4d99b57bb821-kube-api-access-w8b2t\") pod \"telemetry-operator-controller-manager-5ffb97cddf-cwfzj\" (UID: \"8596d634-51fe-4c96-a347-4d99b57bb821\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.030946 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.031111 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.045887 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-b9qr8" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.060629 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.066011 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.071163 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2j5kh" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.113550 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.121846 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.151446 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zf9t\" (UniqueName: \"kubernetes.io/projected/669348d2-73cb-4efe-93b1-95f08fe54e82-kube-api-access-4zf9t\") pod \"test-operator-controller-manager-6bb6dcddc-6c9g6\" (UID: \"669348d2-73cb-4efe-93b1-95f08fe54e82\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.151546 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48km\" (UniqueName: \"kubernetes.io/projected/b3964048-aaf9-4f7b-ba1b-4dba06b0f702-kube-api-access-r48km\") pod \"watcher-operator-controller-manager-5595cf6c95-77swg\" (UID: \"b3964048-aaf9-4f7b-ba1b-4dba06b0f702\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.151730 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8b2t\" (UniqueName: \"kubernetes.io/projected/8596d634-51fe-4c96-a347-4d99b57bb821-kube-api-access-w8b2t\") pod \"telemetry-operator-controller-manager-5ffb97cddf-cwfzj\" (UID: \"8596d634-51fe-4c96-a347-4d99b57bb821\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.189373 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8b2t\" (UniqueName: \"kubernetes.io/projected/8596d634-51fe-4c96-a347-4d99b57bb821-kube-api-access-w8b2t\") pod 
\"telemetry-operator-controller-manager-5ffb97cddf-cwfzj\" (UID: \"8596d634-51fe-4c96-a347-4d99b57bb821\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.225587 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.226833 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.234026 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.273471 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rv5tl" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.279201 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.335565 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.335731 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkbj5\" (UniqueName: \"kubernetes.io/projected/d9fab51c-9ec1-4099-9dd5-0177bb096874-kube-api-access-pkbj5\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.337027 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.337884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zf9t\" (UniqueName: \"kubernetes.io/projected/669348d2-73cb-4efe-93b1-95f08fe54e82-kube-api-access-4zf9t\") pod \"test-operator-controller-manager-6bb6dcddc-6c9g6\" (UID: \"669348d2-73cb-4efe-93b1-95f08fe54e82\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.337984 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48km\" (UniqueName: \"kubernetes.io/projected/b3964048-aaf9-4f7b-ba1b-4dba06b0f702-kube-api-access-r48km\") pod \"watcher-operator-controller-manager-5595cf6c95-77swg\" (UID: \"b3964048-aaf9-4f7b-ba1b-4dba06b0f702\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.339718 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.340343 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.349106 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.408372 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zf9t\" (UniqueName: \"kubernetes.io/projected/669348d2-73cb-4efe-93b1-95f08fe54e82-kube-api-access-4zf9t\") pod \"test-operator-controller-manager-6bb6dcddc-6c9g6\" (UID: \"669348d2-73cb-4efe-93b1-95f08fe54e82\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.417556 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48km\" (UniqueName: \"kubernetes.io/projected/b3964048-aaf9-4f7b-ba1b-4dba06b0f702-kube-api-access-r48km\") pod \"watcher-operator-controller-manager-5595cf6c95-77swg\" (UID: \"b3964048-aaf9-4f7b-ba1b-4dba06b0f702\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.438621 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.455884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.460979 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkbj5\" (UniqueName: \"kubernetes.io/projected/d9fab51c-9ec1-4099-9dd5-0177bb096874-kube-api-access-pkbj5\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.461173 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:54 crc kubenswrapper[4990]: E1003 09:58:54.456375 4990 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 09:58:54 crc kubenswrapper[4990]: E1003 09:58:54.462499 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert podName:d9fab51c-9ec1-4099-9dd5-0177bb096874 nodeName:}" failed. No retries permitted until 2025-10-03 09:58:54.962457136 +0000 UTC m=+916.759088993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert") pod "openstack-operator-controller-manager-54df7874c5-glg9l" (UID: "d9fab51c-9ec1-4099-9dd5-0177bb096874") : secret "webhook-server-cert" not found Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.477988 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.492383 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6683b324-a42c-419b-85a6-35386e0016bb-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2\" (UID: \"6683b324-a42c-419b-85a6-35386e0016bb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.499642 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.507918 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.528093 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.529744 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.533860 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-tw2dd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.536625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkbj5\" (UniqueName: \"kubernetes.io/projected/d9fab51c-9ec1-4099-9dd5-0177bb096874-kube-api-access-pkbj5\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.542128 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.623632 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.673986 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xx4\" (UniqueName: \"kubernetes.io/projected/fc10a0dc-8a4d-4c45-abeb-376dc7344f48-kube-api-access-p5xx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-frzpd\" (UID: \"fc10a0dc-8a4d-4c45-abeb-376dc7344f48\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.776874 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.780541 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xx4\" (UniqueName: \"kubernetes.io/projected/fc10a0dc-8a4d-4c45-abeb-376dc7344f48-kube-api-access-p5xx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-frzpd\" (UID: \"fc10a0dc-8a4d-4c45-abeb-376dc7344f48\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.790785 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.824828 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xx4\" (UniqueName: \"kubernetes.io/projected/fc10a0dc-8a4d-4c45-abeb-376dc7344f48-kube-api-access-p5xx4\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-frzpd\" (UID: \"fc10a0dc-8a4d-4c45-abeb-376dc7344f48\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.881900 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.895141 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e-cert\") pod \"infra-operator-controller-manager-7c9978f67-vhtvr\" (UID: \"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e\") " 
pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.898508 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd"] Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.907879 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.959867 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:58:54 crc kubenswrapper[4990]: W1003 09:58:54.974954 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod466406f5_5f91_433d_859e_966713b4d752.slice/crio-c23189060b01c7be7b930f5a64d0e4cac6eb17c0a606f8715928b3790723f78b WatchSource:0}: Error finding container c23189060b01c7be7b930f5a64d0e4cac6eb17c0a606f8715928b3790723f78b: Status 404 returned error can't find the container with id c23189060b01c7be7b930f5a64d0e4cac6eb17c0a606f8715928b3790723f78b Oct 03 09:58:54 crc kubenswrapper[4990]: I1003 09:58:54.984607 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:54 crc kubenswrapper[4990]: E1003 09:58:54.984861 4990 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 09:58:54 crc kubenswrapper[4990]: E1003 09:58:54.984917 4990 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert podName:d9fab51c-9ec1-4099-9dd5-0177bb096874 nodeName:}" failed. No retries permitted until 2025-10-03 09:58:55.984901465 +0000 UTC m=+917.781533322 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert") pod "openstack-operator-controller-manager-54df7874c5-glg9l" (UID: "d9fab51c-9ec1-4099-9dd5-0177bb096874") : secret "webhook-server-cert" not found Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.249946 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" event={"ID":"466406f5-5f91-433d-859e-966713b4d752","Type":"ContainerStarted","Data":"c23189060b01c7be7b930f5a64d0e4cac6eb17c0a606f8715928b3790723f78b"} Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.253255 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" event={"ID":"d863cc9f-06fe-4045-bb72-f2d3c36d14f3","Type":"ContainerStarted","Data":"ab4ae411e0f4d9d58bd41577014dd200b3b245705d50384825cc93604f9f1632"} Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.260486 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" event={"ID":"7c0910c3-4982-4e25-9c19-f65aef6c1dc8","Type":"ContainerStarted","Data":"46b36fd3679b52c377708ea0b6b0b4cb1d4be6fd8acf586f9e3792a20a6c9e1c"} Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.302003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" event={"ID":"28afe97f-598e-4e6f-95b5-3e72e3082504","Type":"ContainerStarted","Data":"a3139041f8424e34104fdfec3454a510828b7df6d8e840ee91c58bcb441c312c"} Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.303616 4990 
patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.303696 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.311047 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" event={"ID":"2e80486e-af20-49c4-9aba-fc7a312ed0b6","Type":"ContainerStarted","Data":"acb4e62801eeae055f0cafc1340b26cb7ebcb2cda6c74a2c1c450041fc6cd6c1"} Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.373016 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.389834 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.401388 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.713248 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.732486 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.741054 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.747187 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj"] Oct 03 09:58:55 crc kubenswrapper[4990]: W1003 09:58:55.801388 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6826d26_170b_42c6_a519_599c8873c53a.slice/crio-4d947355c4619df084e7abf525adceb2d32f3c81b769d022f08f7b26de9e4cee WatchSource:0}: Error finding container 4d947355c4619df084e7abf525adceb2d32f3c81b769d022f08f7b26de9e4cee: Status 404 returned error can't find the container with id 4d947355c4619df084e7abf525adceb2d32f3c81b769d022f08f7b26de9e4cee Oct 03 09:58:55 crc kubenswrapper[4990]: W1003 09:58:55.825651 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2143aa33_b534_4237_b15b_e63c30a4672d.slice/crio-5de5e8297de2af85fbde5e04f5d4b43392d04783fde430429c3a8343e6516bb3 WatchSource:0}: Error finding container 5de5e8297de2af85fbde5e04f5d4b43392d04783fde430429c3a8343e6516bb3: Status 404 returned error can't find the container with id 5de5e8297de2af85fbde5e04f5d4b43392d04783fde430429c3a8343e6516bb3 Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.913297 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 09:58:55.919389 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl"] Oct 03 09:58:55 crc kubenswrapper[4990]: I1003 
09:58:55.933987 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l"] Oct 03 09:58:55 crc kubenswrapper[4990]: W1003 09:58:55.943677 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod568468c9_631c_4cb7_bdcb_a6a6696c23f7.slice/crio-25439cb235d57ee5498f8f215324ec672c7d384f69f6852c975f749df27d0296 WatchSource:0}: Error finding container 25439cb235d57ee5498f8f215324ec672c7d384f69f6852c975f749df27d0296: Status 404 returned error can't find the container with id 25439cb235d57ee5498f8f215324ec672c7d384f69f6852c975f749df27d0296 Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.016707 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.023925 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9fab51c-9ec1-4099-9dd5-0177bb096874-cert\") pod \"openstack-operator-controller-manager-54df7874c5-glg9l\" (UID: \"d9fab51c-9ec1-4099-9dd5-0177bb096874\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.061924 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.082961 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.261993 
4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.262606 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.267549 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.288330 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6"] Oct 03 09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.288489 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4zf9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6bb6dcddc-6c9g6_openstack-operators(669348d2-73cb-4efe-93b1-95f08fe54e82): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 09:58:56 crc kubenswrapper[4990]: W1003 09:58:56.290500 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6683b324_a42c_419b_85a6_35386e0016bb.slice/crio-13de27dd132cb5ac44f97df9ec555ea1b5383f51c2a67917c1559076b0075408 WatchSource:0}: Error finding container 13de27dd132cb5ac44f97df9ec555ea1b5383f51c2a67917c1559076b0075408: Status 404 returned error can't find the container with id 13de27dd132cb5ac44f97df9ec555ea1b5383f51c2a67917c1559076b0075408 Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.306374 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl"] Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.320485 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.346370 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:d1fad97d2cd602a4f7b6fd6c202464ac117b20e6608c17aa04cadbceb78a498d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:1c99923410d4cd0a721d2cc8a51d91d3ac800d5fda508c972ebe1e85ed2ca4d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:af4e2467469edf3b1fa739ef819ead98dfa934542ae40ec3266d58f66ba44f99,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:99f246f3b9bad7c46b671da12cd166614f0573b3dbf0aa04f4b32d4a9f5a81c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:d617f09ab1f6ef522c6f70db597cf20ab79ccebf25e225653cbf2e999354a5c0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1c73b7b1034524ecfb36ce1eaa37ecbbcd5cb3f7fee0149b3bce0b0170bae8ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:0838a5c5edf54c1c8af59c93955f26e4eda6645297058780e0f61c77b65683d9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:c50baa554100db160210b65733f71d6d128e38f96fa0552819854c62ede75953,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:de50c7dd282aa3898f1d0a31ecb2a300688f1f234662e6bbe12f35f88b484083,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:31c0d98fec7ff16416903874af0addeff03a7e72ede256990f2a71589e8be5ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope
-centos9/openstack-ceilometer-notification@sha256:ac586b71d28a6240b29f4b464b19fea812ffc81e1182d172570b4be5ac58ea70,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:a5df039c808a65a273073128a627d6700897d6ebf81a9c62412c7d06be3b9a6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:8f09cdc578caa07e0b5a9ec4e96a251a6d7dd43b2ef1edacb56543c997c259e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:e870d0a1b0c758601a067bfccc539ca04222e0c867872f679cea5833e0fcbf94,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:8f112731484f983f272f4c95558ffa098e96e610ddc5130ee0f2b2a239e9058a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:277ac4620d95ce3fe2f552f59b82b70962ba024d498710adc45b863bcc7244ff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:09eebb1f87217fbb0249f4ebc19192cd282833aac27103081160b8949dd4361c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:e17eb4221e8981df97744e5168a8c759abcd925c2a483
d04e3fdecd78128dae4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:02f99d84c8cc2c59ac4b8d98f219a1138b0aed8e50f91f9326ef55db5c187cd8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:7a636c7f518127d4292aa5417113fd611b85ad49ddbc8273455aa2fe5066a533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:61f617fd809b55b2eceeec84b3283757af80d1001659e80877ac69e9643ba89f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:0b083fceb6e323a30f4c7308a275ea88243420ef38df77ac322af302c4c4dd2d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:9e173574f9216e5c42498c3794075ead54b6850c66094c4be628b52063f5814c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:980d0d43a83e61b74634b46864c2070fcb26348f8bc5a3375f161703e4041d3d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:d561737cf54869c67a819635c4a10ca4a9ed21cc6046ffd4f17301670d9a25fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:941076bbb1577abd91f42e0f19b0a191f7e393135d823ed203b122875033888b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2133db6669a24570a266e7c053fc71bbfadd16cd9cd0bc8b87633e73c03c4719,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:55682010f0f5aea02f59df1e0a827cc6915048b7545c25432fb0cb8501898d0b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:814536e8e4848f6612cd4ada641d46ae7d766878b89918fc5df11f3930747d3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:a4f12a27e60f17034ba47f57dba0c5ae3f9e3c6c681f2e417bb87cb132f502e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:17b8c6c9fbcc7092cba64a264adb9af6decd7db24ee2c60607a9045d55031b51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:f0864392605772b30f07dcb67ec8bb75d5b779756c537983377044d899c1b099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:d9a44db937205e4c4f2cd2d247d230de2eb9207089f35a7ae7cfb11301406f
ac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1ab1deb86e7e5ba67b4cd9f5974de6707e5a5948e8f01fc1156dbf5e452340a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:a895c2b3a12aa21f9541a76213b6058ce3252aca002d66025d5935f4ea5873c7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:e7c778fd348f881fea490bb9ddf465347068a60fcd65f9cbfedb615815bba2a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:a21c91d6927d863be8aef3023a527bc3466a0ddffc018df0c970ce14396ceee0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:7053c79b8354195fd09a5ea1347ad49a35443923d4e4578f80615c63d83313d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:4626ebaa9dbe27fc95b31a48e69397fadef7c9779670c01555f872873c393f74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:c840d7e9775d7f7ed1c6700d973bef79318fe92ac6fc8ed0616dcec13ef95c92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:fcb50aade382ff516554b84b45c742a5adafb460fd67bd0fa2fc7cbb30adf5c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:54373b05fcd33538d153507943da0c118e303a5c61a19c6bbe79a0786fe8ce1d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/op
enstack-keystone@sha256:8c9be58185245280d7282e8973cc6e23e6b08520ce126aeb91cfbcef0c144690,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:676ba6130835d00defc3214769d5fe1827ee41420a05f8556f361aac502a7efc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:3dbd2ac58b5f64ab3cf3eef3c44a52f0ccd363568c0739a5d18d6b9c9edddf5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:fded6f454a54e601894e06989243e8896f43940c77cd8f4c904fe43c120b1595,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:5d10c016b13499110b5f9ca2bccfaf6d2fd4298c9f02580d7208fe91850da0a6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:43f2c4ec2e38934288015cb5d5ae92941e8b3fa9a613539175641e2c16cfc0cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:c506a314e354f1ab274c46f9969b254f820e7515bbd9a24c9877dfbb10ece37e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:96d4b699758dd3d408b4c672dbe4392fd09783b4dc60783389905d7220b6524c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFA
ULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:7a0f3de7dda85fba7ad2929c7b01a2d42c11df9fe83f47a8e499a9da51e7f48c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:49b5ae7f895266b90cf3c02503fb7146726e59ad782fdf88112ad6954112d7e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:19b3d48b3c29eaa3a6d76fc145e212389f245c077bbf24eb5c1de0c96f3f7190,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:227891a9f4821a92c49ddc27301303287d5632b6ac199e9fe402581f1831ec01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:26e3ada4b9fee357ef8bbb1c342b38c49c096ede8a498116e3753ad45354fb47,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:2789b45ae2a5a9a80e4864e691f9e32fb9c9e1938cf92bda7c07defdbc78cdc2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:8df8259e737625667b13897dc0094bf3d7ced54f414dda93293ad4cb68af1d43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:2fe4f8c71e11a926450d6553e5cb5c7b2db5d0de8426aa969f30d3d566114ff8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:ab5265aef98352336f23b18080f3ba110250859dc0edc20819348311a4a53044,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724
277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:a3c1b94a285064d150145340c06ad5b0afc4aa20caa74523f3972c19b1d1ea61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:129e24971fee94cc60b5f440605f1512fb932a884e38e64122f38f11f942e3b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:b96baffbb926f93936bd52f2a1ef4fe1d31bb469d6489e9fb67bf00b99156551,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:d659d1ffbbaff7c76fc96e6600dc9b03c53af2c9d63cfb4626dfb5831b7b1ad7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:e3fcd72e1a2790ca7db5d5c40c1ae597de4b020dd51debcab063352e6e5f7d79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:2504c0db038b850cdd6057fc50e109715a4453c386e4f4d4f901a20dc7b2036a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e3accbf4293c544194bd2151d4d0bd8b26828ddacda968bad5d5a6f05c2406db,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_
ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:75ce8c4f9c68aaba6cab59749e726b2f94d29ba7b7897b18112fe1bd350efd8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:6390af78808d7cd69a4f5c7cb88f47690e54c9b8838b9461f4b21c4127ce770c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:14489a8a681c482a643cb47fa90d0a3596b4570e13cfc760541ac80d37cd31b3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:87367a67c7cb73476fb8d08ba108da843ac61170381458608e778a33c024c0c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:d6123a9349d422888df97ee72d32643dd534f81c521f6f313c5d5e64e2db60c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:b273fd1e1da4190dc4cc67469d180b66b5a22eb6ec9afc76ef36dd6ea2beaea5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:9561306ec9455914cd05a0a0b3e56d72c7164aa41d0f0ef9b03ac7d7343538b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:1115e5a2dce397b4a34a082cba1937903818ab5928048fcf775c4a4e6dda2d07,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t4jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2_openstack-operators(6683b324-a42c-419b-85a6-35386e0016bb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.356259 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" event={"ID":"a6826d26-170b-42c6-a519-599c8873c53a","Type":"ContainerStarted","Data":"4d947355c4619df084e7abf525adceb2d32f3c81b769d022f08f7b26de9e4cee"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.362736 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" event={"ID":"b3964048-aaf9-4f7b-ba1b-4dba06b0f702","Type":"ContainerStarted","Data":"0fa7c27535631a91af4a83ff3b5fa6b43eb7853cc7b56751e5ef24b1e92a7dd7"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.365238 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" event={"ID":"aec15413-8ac4-4d63-a4d3-f28754800047","Type":"ContainerStarted","Data":"3095f7d9c153553de0ecabe118cfd715cd4347f04985bf8fbfa5ac695d8cade6"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.367415 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" event={"ID":"66d8cf5e-b670-4023-8e90-de7accf71f47","Type":"ContainerStarted","Data":"60c02dd271d5bd5e6428974860396a2d23c3c3179f836178cb1642af2e3c5d7c"} Oct 03 09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.385950 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 
500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bvz4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-76d5577b-2j7gl_openstack-operators(5e6f315d-894b-4bcc-89fa-4cfa08e9cf88): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.390395 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" 
event={"ID":"8c8ec33a-ded3-49b2-9f80-41f52b2d2833","Type":"ContainerStarted","Data":"bdf4de84276035554291bc3133169d2b2d7c1015b2aef04c12ae2dee82eb8ddc"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.396295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" event={"ID":"2143aa33-b534-4237-b15b-e63c30a4672d","Type":"ContainerStarted","Data":"5de5e8297de2af85fbde5e04f5d4b43392d04783fde430429c3a8343e6516bb3"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.398257 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" event={"ID":"638beb05-4965-4934-90de-4b06ff173650","Type":"ContainerStarted","Data":"624722d9b4f7180890ae346f8e03ffddc471c746f4053496578368755f7c8528"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.402081 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" event={"ID":"8596d634-51fe-4c96-a347-4d99b57bb821","Type":"ContainerStarted","Data":"105ad4f9d687272af3cb3f644402a98ab308d2ec20567c252e3133bff33bee65"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.404617 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" event={"ID":"568468c9-631c-4cb7-bdcb-a6a6696c23f7","Type":"ContainerStarted","Data":"25439cb235d57ee5498f8f215324ec672c7d384f69f6852c975f749df27d0296"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.407773 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" event={"ID":"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5","Type":"ContainerStarted","Data":"a0fd48f19ded233b1a0c65f94de5ded2f8ae3e78fc6c63f07dc32e6d3becf631"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.411238 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" event={"ID":"fc10a0dc-8a4d-4c45-abeb-376dc7344f48","Type":"ContainerStarted","Data":"7bae66a4963a44deb60ec9f056ecff0d26ff41dcbb2017f28d38952539b42430"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.413137 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" event={"ID":"cac8d3c8-ac83-4af9-a285-4020d5978c74","Type":"ContainerStarted","Data":"bfe69e5249a4120181ed86ca2c78b2737617ef7979b6854b44d8ec978abf5845"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.415526 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" event={"ID":"e76b0ac3-f0b9-4a88-b495-c6430a9568fe","Type":"ContainerStarted","Data":"f5ba8614f4d8631b9ca05ec1f5600e30b5e3546a37bfab296fcbd9f2b0bcd938"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.416755 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" event={"ID":"6683b324-a42c-419b-85a6-35386e0016bb","Type":"ContainerStarted","Data":"13de27dd132cb5ac44f97df9ec555ea1b5383f51c2a67917c1559076b0075408"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.418984 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" event={"ID":"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e","Type":"ContainerStarted","Data":"887029ec3d9f16f84da60a3d0bcd536dadec1dbfd9ecb2289407afc122620019"} Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.432196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" event={"ID":"669348d2-73cb-4efe-93b1-95f08fe54e82","Type":"ContainerStarted","Data":"b01710cc765a243df0a559233b61623628f84f8f0bcf3b7ac802be48f1600f28"} Oct 03 
09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.667400 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" podUID="669348d2-73cb-4efe-93b1-95f08fe54e82" Oct 03 09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.694085 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podUID="6683b324-a42c-419b-85a6-35386e0016bb" Oct 03 09:58:56 crc kubenswrapper[4990]: E1003 09:58:56.711371 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podUID="5e6f315d-894b-4bcc-89fa-4cfa08e9cf88" Oct 03 09:58:56 crc kubenswrapper[4990]: I1003 09:58:56.909273 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l"] Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.500651 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" event={"ID":"d9fab51c-9ec1-4099-9dd5-0177bb096874","Type":"ContainerStarted","Data":"5d0dae847d1f5e491d93d7ff1957a8bbc85082e03474c89484f464f321011a55"} Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.501191 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" event={"ID":"d9fab51c-9ec1-4099-9dd5-0177bb096874","Type":"ContainerStarted","Data":"091caae9702ebc12932b91c7ee396da2f89adc42c0c44243ecdbe87c89c6ae9a"} Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.519299 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" event={"ID":"6683b324-a42c-419b-85a6-35386e0016bb","Type":"ContainerStarted","Data":"edfc76c7df171121bb1e366b0ed7433c73702ca7a4aa5d78ff50f65e1f2af142"} Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.527588 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" event={"ID":"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88","Type":"ContainerStarted","Data":"7d92afd3c0d716c6cc6fbbfdc9033611372f8cb9f71975a8a85f5e6fbd8e0f0c"} Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.527651 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" event={"ID":"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88","Type":"ContainerStarted","Data":"70a7a8eef622790f82796c5122db3eaf8c308a0e91edfc4c9afbc32a7f52c438"} Oct 03 09:58:57 crc kubenswrapper[4990]: E1003 09:58:57.530139 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podUID="6683b324-a42c-419b-85a6-35386e0016bb" Oct 03 09:58:57 crc kubenswrapper[4990]: E1003 09:58:57.534206 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podUID="5e6f315d-894b-4bcc-89fa-4cfa08e9cf88" Oct 03 09:58:57 crc kubenswrapper[4990]: I1003 09:58:57.591532 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" event={"ID":"669348d2-73cb-4efe-93b1-95f08fe54e82","Type":"ContainerStarted","Data":"945dd111d8e587e67ee4f4061dcef154928feb340159c25c66fdfa81d0dda540"} Oct 03 09:58:57 crc kubenswrapper[4990]: E1003 09:58:57.596594 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" podUID="669348d2-73cb-4efe-93b1-95f08fe54e82" Oct 03 09:58:58 crc kubenswrapper[4990]: E1003 09:58:58.607318 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" podUID="669348d2-73cb-4efe-93b1-95f08fe54e82" Oct 03 09:58:58 crc kubenswrapper[4990]: E1003 09:58:58.611738 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podUID="5e6f315d-894b-4bcc-89fa-4cfa08e9cf88" Oct 03 09:58:58 crc kubenswrapper[4990]: E1003 09:58:58.613179 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podUID="6683b324-a42c-419b-85a6-35386e0016bb" Oct 03 09:59:00 crc kubenswrapper[4990]: I1003 09:59:00.629447 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" event={"ID":"d9fab51c-9ec1-4099-9dd5-0177bb096874","Type":"ContainerStarted","Data":"5c8200d41dfa48c1a97ee35f329d72cf3e8387a26c890b7f56a862e8f9bef858"} Oct 03 09:59:00 crc kubenswrapper[4990]: I1003 09:59:00.630169 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:59:06 crc kubenswrapper[4990]: I1003 09:59:06.328728 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" Oct 03 09:59:06 crc kubenswrapper[4990]: I1003 09:59:06.366939 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-glg9l" podStartSLOduration=12.366918315 podStartE2EDuration="12.366918315s" podCreationTimestamp="2025-10-03 09:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:59:00.665669365 +0000 UTC m=+922.462301242" watchObservedRunningTime="2025-10-03 09:59:06.366918315 +0000 UTC m=+928.163550172" Oct 03 09:59:10 crc kubenswrapper[4990]: E1003 09:59:10.505735 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475" Oct 03 
09:59:10 crc kubenswrapper[4990]: E1003 09:59:10.506437 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cxdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7c9978f67-vhtvr_openstack-operators(0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 09:59:10 crc kubenswrapper[4990]: E1003 09:59:10.919616 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" podUID="0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e" Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.778927 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" event={"ID":"8c8ec33a-ded3-49b2-9f80-41f52b2d2833","Type":"ContainerStarted","Data":"a8c7b3eb394b7804ad0a1123d1cc750db982d60c818d853de6d10c11eb92e7c4"} Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.802370 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" 
event={"ID":"638beb05-4965-4934-90de-4b06ff173650","Type":"ContainerStarted","Data":"5277ddcb8a706901e1238c71cb1226cb5c32e88cb877471bca8f37a40cd40c02"} Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.825973 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" event={"ID":"2e80486e-af20-49c4-9aba-fc7a312ed0b6","Type":"ContainerStarted","Data":"f49885f7f72b2c39422d0b69174d105214dc3ea80f606a96fc46438f85587569"} Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.857455 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" event={"ID":"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e","Type":"ContainerStarted","Data":"478f36105377649cb1629b2e892306d9f8d58a32e19e17bab12b0983fa2dfd81"} Oct 03 09:59:11 crc kubenswrapper[4990]: E1003 09:59:11.862939 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" podUID="0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e" Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.906933 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" event={"ID":"d863cc9f-06fe-4045-bb72-f2d3c36d14f3","Type":"ContainerStarted","Data":"7c80df64582a8e3c01da76544ff988c52fe20109ec189ed2d5205552c5a9160f"} Oct 03 09:59:11 crc kubenswrapper[4990]: I1003 09:59:11.962727 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" 
event={"ID":"a6826d26-170b-42c6-a519-599c8873c53a","Type":"ContainerStarted","Data":"1a7cc65ceafbc3b7e7112371b16c2acba8e9ff483951ed3fd867524a6add01ab"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.007306 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" event={"ID":"466406f5-5f91-433d-859e-966713b4d752","Type":"ContainerStarted","Data":"3f1b4eac739729e4b8eb145e9bab683bb665ad7815150e0a4800c4d44d6f760d"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.017991 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" event={"ID":"e76b0ac3-f0b9-4a88-b495-c6430a9568fe","Type":"ContainerStarted","Data":"225458a5300704fe505b813e68cc677721e6cab2ba95b083fe7b445a33f6e9c3"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.028676 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" event={"ID":"8596d634-51fe-4c96-a347-4d99b57bb821","Type":"ContainerStarted","Data":"cee817b4b96380ac213150b609987c255ec7ab1955c52c3f4c89029aeb705880"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.077384 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" event={"ID":"aec15413-8ac4-4d63-a4d3-f28754800047","Type":"ContainerStarted","Data":"56a9bd23cdac68933cd8c92212936ee270513f6c66a851eec371d438e6179b1f"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.085231 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" event={"ID":"66d8cf5e-b670-4023-8e90-de7accf71f47","Type":"ContainerStarted","Data":"16e6305ba98141508876ce57a96fb8de8e055c893fcec2a40837ead128d3d874"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.107786 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" event={"ID":"7c0910c3-4982-4e25-9c19-f65aef6c1dc8","Type":"ContainerStarted","Data":"312f2150aa89c49e8129cdb993e8d5fe35c98cca47386a2d33d5cbc7fe0c44eb"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.135920 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" event={"ID":"28afe97f-598e-4e6f-95b5-3e72e3082504","Type":"ContainerStarted","Data":"7d50b5d92b8d4a377d96b46f688d29387e71d3821043357dba433d0b6b73a285"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.136478 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.150948 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" event={"ID":"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5","Type":"ContainerStarted","Data":"034db9e55ccc2e773910d97b088352f12d6d75879d38df9c1d25b0c860d19bb0"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.161729 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" event={"ID":"fc10a0dc-8a4d-4c45-abeb-376dc7344f48","Type":"ContainerStarted","Data":"a495040257e4e3270fd252936bd17ac396fbc3b4ba48461d2b3f2f3d18cd4c8a"} Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.183427 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" podStartSLOduration=3.535375183 podStartE2EDuration="19.183385139s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:54.962379021 +0000 UTC m=+916.759010888" lastFinishedPulling="2025-10-03 09:59:10.610388987 +0000 UTC m=+932.407020844" 
observedRunningTime="2025-10-03 09:59:12.165629479 +0000 UTC m=+933.962261346" watchObservedRunningTime="2025-10-03 09:59:12.183385139 +0000 UTC m=+933.980016996" Oct 03 09:59:12 crc kubenswrapper[4990]: I1003 09:59:12.211038 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-frzpd" podStartSLOduration=3.869555591 podStartE2EDuration="18.211005104s" podCreationTimestamp="2025-10-03 09:58:54 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.27380561 +0000 UTC m=+918.070437467" lastFinishedPulling="2025-10-03 09:59:10.615255123 +0000 UTC m=+932.411886980" observedRunningTime="2025-10-03 09:59:12.202910754 +0000 UTC m=+933.999542611" watchObservedRunningTime="2025-10-03 09:59:12.211005104 +0000 UTC m=+934.007636971" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.182203 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" event={"ID":"aec15413-8ac4-4d63-a4d3-f28754800047","Type":"ContainerStarted","Data":"5719801951b71a2fad86675aea61bb27b9971250ce0965c9c34ed1ebf5dc2cc3"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.183334 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.190399 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" event={"ID":"568468c9-631c-4cb7-bdcb-a6a6696c23f7","Type":"ContainerStarted","Data":"146d4f5caf0c10c4f2bb3edd9b67631c751ac0aa7b7e4146cd98a4d571f57d31"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.190457 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" 
event={"ID":"568468c9-631c-4cb7-bdcb-a6a6696c23f7","Type":"ContainerStarted","Data":"312e95125182659463cdb4f9df12696d135ce483eccf67562f158f7708978e26"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.191176 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.213988 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" podStartSLOduration=5.346413689 podStartE2EDuration="20.213960535s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.74705493 +0000 UTC m=+917.543686787" lastFinishedPulling="2025-10-03 09:59:10.614601776 +0000 UTC m=+932.411233633" observedRunningTime="2025-10-03 09:59:13.206119882 +0000 UTC m=+935.002751749" watchObservedRunningTime="2025-10-03 09:59:13.213960535 +0000 UTC m=+935.010592392" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.224636 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" event={"ID":"8c8ec33a-ded3-49b2-9f80-41f52b2d2833","Type":"ContainerStarted","Data":"0f58804427a5fc052b2923c9c29b7e283928b818f66290a8f30fde991d4517a7"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.225178 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.232968 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" event={"ID":"466406f5-5f91-433d-859e-966713b4d752","Type":"ContainerStarted","Data":"b5ae80d8d65240ab33f55175266d5ba0abfa36863d3ecf8a14396926088c1301"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.233674 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.239272 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" event={"ID":"2143aa33-b534-4237-b15b-e63c30a4672d","Type":"ContainerStarted","Data":"030ad9218e9617d022ad3adb8d7d94481235389f3f169e42b552fe600e5cc992"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.239404 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" event={"ID":"2143aa33-b534-4237-b15b-e63c30a4672d","Type":"ContainerStarted","Data":"12d620a95a66b822d8ccabada9b28b352cc831e48b7b8f2c14b0d7afb7ee40a3"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.240078 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.247953 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" podStartSLOduration=5.551714795 podStartE2EDuration="20.247931225s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.950370564 +0000 UTC m=+917.747002421" lastFinishedPulling="2025-10-03 09:59:10.646586984 +0000 UTC m=+932.443218851" observedRunningTime="2025-10-03 09:59:13.240959464 +0000 UTC m=+935.037591331" watchObservedRunningTime="2025-10-03 09:59:13.247931225 +0000 UTC m=+935.044563082" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.248434 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" 
event={"ID":"66d8cf5e-b670-4023-8e90-de7accf71f47","Type":"ContainerStarted","Data":"6a8ad9693800b8a95bb392c269849a43fa88ffbe526cab063bb6f229f7adf606"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.250754 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.259974 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" event={"ID":"7c0910c3-4982-4e25-9c19-f65aef6c1dc8","Type":"ContainerStarted","Data":"90b2f409dc01b22d977deb01ef673df2f108156dd8cabf8d497a37a6fb15efab"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.260682 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.262039 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" event={"ID":"28afe97f-598e-4e6f-95b5-3e72e3082504","Type":"ContainerStarted","Data":"b439d02fbb532b23594ff0dfa5042a9563a7e1973a74e3880d4c35a85003dffb"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.266425 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" event={"ID":"2e80486e-af20-49c4-9aba-fc7a312ed0b6","Type":"ContainerStarted","Data":"865da91709cc694ab8d450f711db6ecc17edda3a7b048976a968645c413aba25"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.267503 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.274133 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" event={"ID":"8596d634-51fe-4c96-a347-4d99b57bb821","Type":"ContainerStarted","Data":"b96e6d05c228fc5a6699f75b24299c3e4aacb798a0142a2a1ea496b92c1257b5"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.274310 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.275016 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" podStartSLOduration=5.502726757 podStartE2EDuration="20.274992786s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.842007108 +0000 UTC m=+917.638638955" lastFinishedPulling="2025-10-03 09:59:10.614273107 +0000 UTC m=+932.410904984" observedRunningTime="2025-10-03 09:59:13.26860091 +0000 UTC m=+935.065232777" watchObservedRunningTime="2025-10-03 09:59:13.274992786 +0000 UTC m=+935.071624653" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.295086 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" event={"ID":"d97d66ac-8e4e-414b-bad7-1f4323ee7ac5","Type":"ContainerStarted","Data":"d5457f5061300cbfc29cff2302ab9e13eca7a30fd12733bc3ebca834fadf676f"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.297920 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.311345 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" podStartSLOduration=5.149907431 podStartE2EDuration="20.311318456s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" 
firstStartedPulling="2025-10-03 09:58:55.466346531 +0000 UTC m=+917.262978388" lastFinishedPulling="2025-10-03 09:59:10.627757556 +0000 UTC m=+932.424389413" observedRunningTime="2025-10-03 09:59:13.306540563 +0000 UTC m=+935.103172430" watchObservedRunningTime="2025-10-03 09:59:13.311318456 +0000 UTC m=+935.107950313" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.316864 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" event={"ID":"cac8d3c8-ac83-4af9-a285-4020d5978c74","Type":"ContainerStarted","Data":"f99dd48084786fe7fa2bd63099e2cd283d78bd3b0011706377065c0e6fd6bccc"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.316925 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" event={"ID":"cac8d3c8-ac83-4af9-a285-4020d5978c74","Type":"ContainerStarted","Data":"dc7797cd4b8dd000a0375fac8c9361a6bb68eef0273e5de8cdcac5cc771af917"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.317721 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.328224 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" event={"ID":"a6826d26-170b-42c6-a519-599c8873c53a","Type":"ContainerStarted","Data":"d9e567f595586590c97dca9cc36a009858fefe2fa0b6031f3cb9cd76358535b5"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.329476 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.338547 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" 
podStartSLOduration=5.750306434 podStartE2EDuration="21.338528041s" podCreationTimestamp="2025-10-03 09:58:52 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.022137219 +0000 UTC m=+916.818769076" lastFinishedPulling="2025-10-03 09:59:10.610358826 +0000 UTC m=+932.406990683" observedRunningTime="2025-10-03 09:59:13.331071138 +0000 UTC m=+935.127703015" watchObservedRunningTime="2025-10-03 09:59:13.338528041 +0000 UTC m=+935.135159898" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.340080 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" event={"ID":"638beb05-4965-4934-90de-4b06ff173650","Type":"ContainerStarted","Data":"067ccfe754bb4e7c4282b69e263a471b7ed8a88826cea898804013fc913e859a"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.340257 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.352408 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" event={"ID":"d863cc9f-06fe-4045-bb72-f2d3c36d14f3","Type":"ContainerStarted","Data":"b711f004f1c8029f91174bd075dc0b6f67bb5591efad8633ef6083ca3b34211a"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.353011 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.359724 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" event={"ID":"b3964048-aaf9-4f7b-ba1b-4dba06b0f702","Type":"ContainerStarted","Data":"e1489f5403ac1f1a91571ffd2fc38fe43aa533601cc134fe432c53cef172f58f"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.359826 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" event={"ID":"b3964048-aaf9-4f7b-ba1b-4dba06b0f702","Type":"ContainerStarted","Data":"798f848b74c099c9ba7ab555e6694e1efeba0cf39efa3fa62dbfd25a5f77d500"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.360049 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.360061 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" podStartSLOduration=5.235960788 podStartE2EDuration="20.360046637s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.491186914 +0000 UTC m=+917.287818771" lastFinishedPulling="2025-10-03 09:59:10.615272753 +0000 UTC m=+932.411904620" observedRunningTime="2025-10-03 09:59:13.355352847 +0000 UTC m=+935.151984714" watchObservedRunningTime="2025-10-03 09:59:13.360046637 +0000 UTC m=+935.156678524" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.362993 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" event={"ID":"e76b0ac3-f0b9-4a88-b495-c6430a9568fe","Type":"ContainerStarted","Data":"3306fe2d21326c933abac58a288eb40fcd97e787fae7b8733f29353014a49411"} Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.363827 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:59:13 crc kubenswrapper[4990]: E1003 09:59:13.367725 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" podUID="0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.379391 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" podStartSLOduration=5.391546673 podStartE2EDuration="21.379373818s" podCreationTimestamp="2025-10-03 09:58:52 +0000 UTC" firstStartedPulling="2025-10-03 09:58:54.627420458 +0000 UTC m=+916.424052315" lastFinishedPulling="2025-10-03 09:59:10.615247603 +0000 UTC m=+932.411879460" observedRunningTime="2025-10-03 09:59:13.378716181 +0000 UTC m=+935.175348058" watchObservedRunningTime="2025-10-03 09:59:13.379373818 +0000 UTC m=+935.176005675" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.431882 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" podStartSLOduration=5.231752745 podStartE2EDuration="21.431860277s" podCreationTimestamp="2025-10-03 09:58:52 +0000 UTC" firstStartedPulling="2025-10-03 09:58:54.410243874 +0000 UTC m=+916.206875731" lastFinishedPulling="2025-10-03 09:59:10.610351406 +0000 UTC m=+932.406983263" observedRunningTime="2025-10-03 09:59:13.428955961 +0000 UTC m=+935.225587818" watchObservedRunningTime="2025-10-03 09:59:13.431860277 +0000 UTC m=+935.228492134" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.433377 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" podStartSLOduration=5.657253597 podStartE2EDuration="20.433370756s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.830267164 +0000 UTC m=+917.626899021" 
lastFinishedPulling="2025-10-03 09:59:10.606384313 +0000 UTC m=+932.403016180" observedRunningTime="2025-10-03 09:59:13.399818567 +0000 UTC m=+935.196450434" watchObservedRunningTime="2025-10-03 09:59:13.433370756 +0000 UTC m=+935.230002613" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.455719 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" podStartSLOduration=5.917265 podStartE2EDuration="20.455689644s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.089280201 +0000 UTC m=+917.885912058" lastFinishedPulling="2025-10-03 09:59:10.627704845 +0000 UTC m=+932.424336702" observedRunningTime="2025-10-03 09:59:13.45322928 +0000 UTC m=+935.249861137" watchObservedRunningTime="2025-10-03 09:59:13.455689644 +0000 UTC m=+935.252321501" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.476325 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" podStartSLOduration=5.667738148 podStartE2EDuration="20.476297937s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.806441817 +0000 UTC m=+917.603073674" lastFinishedPulling="2025-10-03 09:59:10.615001606 +0000 UTC m=+932.411633463" observedRunningTime="2025-10-03 09:59:13.476246266 +0000 UTC m=+935.272878113" watchObservedRunningTime="2025-10-03 09:59:13.476297937 +0000 UTC m=+935.272929794" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.513127 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" podStartSLOduration=5.818571635 podStartE2EDuration="20.51309323s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.919794763 +0000 UTC m=+917.716426620" lastFinishedPulling="2025-10-03 
09:59:10.614316358 +0000 UTC m=+932.410948215" observedRunningTime="2025-10-03 09:59:13.500191786 +0000 UTC m=+935.296823683" watchObservedRunningTime="2025-10-03 09:59:13.51309323 +0000 UTC m=+935.309725097" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.524148 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" podStartSLOduration=6.326193676 podStartE2EDuration="21.524122026s" podCreationTimestamp="2025-10-03 09:58:52 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.416660405 +0000 UTC m=+917.213292262" lastFinishedPulling="2025-10-03 09:59:10.614588755 +0000 UTC m=+932.411220612" observedRunningTime="2025-10-03 09:59:13.5211767 +0000 UTC m=+935.317808557" watchObservedRunningTime="2025-10-03 09:59:13.524122026 +0000 UTC m=+935.320753883" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.551599 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" podStartSLOduration=5.869229467 podStartE2EDuration="20.551577797s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:55.924050473 +0000 UTC m=+917.720682330" lastFinishedPulling="2025-10-03 09:59:10.606398803 +0000 UTC m=+932.403030660" observedRunningTime="2025-10-03 09:59:13.545277624 +0000 UTC m=+935.341909501" watchObservedRunningTime="2025-10-03 09:59:13.551577797 +0000 UTC m=+935.348209644" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.573928 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" podStartSLOduration=4.624975537 podStartE2EDuration="20.573909795s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:54.663468071 +0000 UTC m=+916.460099918" lastFinishedPulling="2025-10-03 09:59:10.612402299 +0000 UTC 
m=+932.409034176" observedRunningTime="2025-10-03 09:59:13.572884448 +0000 UTC m=+935.369516315" watchObservedRunningTime="2025-10-03 09:59:13.573909795 +0000 UTC m=+935.370541652" Oct 03 09:59:13 crc kubenswrapper[4990]: I1003 09:59:13.632626 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" podStartSLOduration=6.120642127 podStartE2EDuration="20.632608845s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.101611621 +0000 UTC m=+917.898243478" lastFinishedPulling="2025-10-03 09:59:10.613578339 +0000 UTC m=+932.410210196" observedRunningTime="2025-10-03 09:59:13.62817506 +0000 UTC m=+935.424806917" watchObservedRunningTime="2025-10-03 09:59:13.632608845 +0000 UTC m=+935.429240702" Oct 03 09:59:15 crc kubenswrapper[4990]: I1003 09:59:15.387281 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-hwbjj" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.280950 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-vhpqb" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.308898 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nls5l" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.448470 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-h88gm" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.486851 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-8vxtx" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.487363 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-fcvl4" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.545483 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-nc6dd" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.549546 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-t2d2q" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.660292 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.660469 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bvz4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-76d5577b-2j7gl_openstack-operators(5e6f315d-894b-4bcc-89fa-4cfa08e9cf88): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.661844 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podUID="5e6f315d-894b-4bcc-89fa-4cfa08e9cf88" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.756536 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-m6x7p" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.761579 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-llmqm" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.772873 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.773350 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:d1fad97d2cd602a4f7b6fd6c202464ac117b20e6608c17aa04cadbceb78a498d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:1c99923410d4cd0a721d2cc8a51d91d3ac800d5fda508c972ebe1e85ed2ca4d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:af4e2467469edf3b1fa739ef819ead98dfa934542ae40ec3266d58f66ba44f99,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:99f246f3b9bad7c46b671da12cd166614f0573b3dbf0aa04f4b32d4a9f5a81c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:d617f09ab1f6ef522c6f70db597cf20ab79ccebf25e225653cbf2e999354a5c0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1c73b7b1034524ecfb36ce1eaa37ecbbcd5cb3f7fee0149b3bce0b0170bae8ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-barbican-keystone-listener@sha256:0838a5c5edf54c1c8af59c93955f26e4eda6645297058780e0f61c77b65683d9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:c50baa554100db160210b65733f71d6d128e38f96fa0552819854c62ede75953,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:de50c7dd282aa3898f1d0a31ecb2a300688f1f234662e6bbe12f35f88b484083,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:31c0d98fec7ff16416903874af0addeff03a7e72ede256990f2a71589e8be5ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ac586b71d28a6240b29f4b464b19fea812ffc81e1182d172570b4be5ac58ea70,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:a5df039c808a65a273073128a627
d6700897d6ebf81a9c62412c7d06be3b9a6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:8f09cdc578caa07e0b5a9ec4e96a251a6d7dd43b2ef1edacb56543c997c259e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:e870d0a1b0c758601a067bfccc539ca04222e0c867872f679cea5833e0fcbf94,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:8f112731484f983f272f4c95558ffa098e96e610ddc5130ee0f2b2a239e9058a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:277ac4620d95ce3fe2f552f59b82b70962ba024d498710adc45b863bcc7244ff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:09eebb1f87217fbb0249f4ebc19192cd282833aac27103081160b8949dd4361c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:e17eb4221e8981df97744e5168a8c759abcd925c2a483d04e3fdecd78128dae4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:02f99d84c8cc2c59ac4b8d98f219a1138b0aed8e50f91f9326ef55db5c187cd8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:7a636c7f518127d4292aa5417113fd611b85ad49ddbc8273455aa2fe5066a533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:61f617fd809b55b2eceeec84b3283757af80d1001659e80877ac69e9643ba89f,ValueFrom:nil,},EnvVar{N
ame:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:0b083fceb6e323a30f4c7308a275ea88243420ef38df77ac322af302c4c4dd2d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:9e173574f9216e5c42498c3794075ead54b6850c66094c4be628b52063f5814c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:980d0d43a83e61b74634b46864c2070fcb26348f8bc5a3375f161703e4041d3d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:d561737cf54869c67a819635c4a10ca4a9ed21cc6046ffd4f17301670d9a25fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:941076bbb1577abd91f42e0f19b0a191f7e393135d823ed203b122875033888b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2133db6669a24570a266e7c053fc71bbfadd16cd9cd0bc8b87633e73c03c4719,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:55682010f0f5aea02f59df1e0a827cc6915048b7545c25432fb0cb8501898d0b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:814536e8e4848f6612cd4ada641d46ae7d766878b89918fc5df11f3930747d3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-
exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:a4f12a27e60f17034ba47f57dba0c5ae3f9e3c6c681f2e417bb87cb132f502e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:17b8c6c9fbcc7092cba64a264adb9af6decd7db24ee2c60607a9045d55031b51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:f0864392605772b30f07dcb67ec8bb75d5b779756c537983377044d899c1b099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:d9a44db937205e4c4f2cd2d247d230de2eb9207089f35a7ae7cfb11301406fac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1ab1deb86e7e5ba67b4cd9f5974de6707e5a5948e8f01fc1156dbf5e452340a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:a895c2b3a12aa21f9541a76213b6058ce3252aca002d66025d5935f4ea5873c7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:e7c778fd348f881fea490bb9ddf465347068a60fcd65f9cbfedb615815bba2a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-antelope-centos9/openstack-ironic-api@sha256:a21c91d6927d863be8aef3023a527bc3466a0ddffc018df0c970ce14396ceee0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:7053c79b8354195fd09a5ea1347ad49a35443923d4e4578f80615c63d83313d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:4626ebaa9dbe27fc95b31a48e69397fadef7c9779670c01555f872873c393f74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:c840d7e9775d7f7ed1c6700d973bef79318fe92ac6fc8ed0616dcec13ef95c92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:fcb50aade382ff516554b84b45c742a5adafb460fd67bd0fa2fc7cbb30adf5c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:54373b05fcd33538d153507943da0c118e303a5c61a19c6bbe79a0786fe8ce1d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:8c9be58185245280d7282e8973cc6e23e6b08520ce126aeb91cfbcef0c144690,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:676ba6130835d00defc3214769d5fe1827ee41420a05f8556f361aac502a7efc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:3dbd2ac58b5f64ab3cf3eef3c44a52f0ccd363568c0739a5
d18d6b9c9edddf5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:fded6f454a54e601894e06989243e8896f43940c77cd8f4c904fe43c120b1595,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:5d10c016b13499110b5f9ca2bccfaf6d2fd4298c9f02580d7208fe91850da0a6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:43f2c4ec2e38934288015cb5d5ae92941e8b3fa9a613539175641e2c16cfc0cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:c506a314e354f1ab274c46f9969b254f820e7515bbd9a24c9877dfbb10ece37e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:96d4b699758dd3d408b4c672dbe4392fd09783b4dc60783389905d7220b6524c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:7a0f3de7dda85fba7ad2929c7b01a2d42c11df9fe83f47a8e499a9da51e7f48c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:49b5ae7f895266b90cf3c02503fb7146726e59ad782fdf88112ad6954112d7e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:19b3d48b3c29eaa3a6d76fc145e212389f245c077bbf24eb5c1de0c96f3f7190,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octa
via-api@sha256:227891a9f4821a92c49ddc27301303287d5632b6ac199e9fe402581f1831ec01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:26e3ada4b9fee357ef8bbb1c342b38c49c096ede8a498116e3753ad45354fb47,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:2789b45ae2a5a9a80e4864e691f9e32fb9c9e1938cf92bda7c07defdbc78cdc2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:8df8259e737625667b13897dc0094bf3d7ced54f414dda93293ad4cb68af1d43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:2fe4f8c71e11a926450d6553e5cb5c7b2db5d0de8426aa969f30d3d566114ff8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:ab5265aef98352336f23b18080f3ba110250859dc0edc20819348311a4a53044,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:a3c1b94a285064d150145340c06ad5b0afc4aa20caa74523f3972c19b1d1ea61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:129e24971fee94cc60b5f440605f1512fb932a884e38e64122f38f11f
942e3b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:b96baffbb926f93936bd52f2a1ef4fe1d31bb469d6489e9fb67bf00b99156551,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:d659d1ffbbaff7c76fc96e6600dc9b03c53af2c9d63cfb4626dfb5831b7b1ad7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:e3fcd72e1a2790ca7db5d5c40c1ae597de4b020dd51debcab063352e6e5f7d79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:2504c0db038b850cdd6057fc50e109715a4453c386e4f4d4f901a20dc7b2036a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e3accbf4293c544194bd2151d4d0bd8b26828ddacda968bad5d5a6f05c2406db,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:75ce8c4f9c68aaba6cab59749e726b2f94d29ba7b7897b18112fe1bd350efd8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:6390af78808d7cd69a4f5c7cb88f47690e54c9b8838b9461f4b21c4127ce770c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:14489a8a681c482a643cb47fa90d0a3596b4570e13cfc760541ac80d37cd31b3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-
centos9/openstack-swift-proxy-server@sha256:87367a67c7cb73476fb8d08ba108da843ac61170381458608e778a33c024c0c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:d6123a9349d422888df97ee72d32643dd534f81c521f6f313c5d5e64e2db60c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:b273fd1e1da4190dc4cc67469d180b66b5a22eb6ec9afc76ef36dd6ea2beaea5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:9561306ec9455914cd05a0a0b3e56d72c7164aa41d0f0ef9b03ac7d7343538b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:1115e5a2dce397b4a34a082cba1937903818ab5928048fcf775c4a4e6dda2d07,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t4jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2_openstack-operators(6683b324-a42c-419b-85a6-35386e0016bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 09:59:23 crc kubenswrapper[4990]: E1003 09:59:23.774683 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podUID="6683b324-a42c-419b-85a6-35386e0016bb" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.817913 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-d2sm6" Oct 03 09:59:23 crc kubenswrapper[4990]: I1003 09:59:23.862167 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-r7r9l" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.118343 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-qljt8" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.283015 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-8dtdl" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.344218 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-rhx7t" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.351900 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-cwfzj" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.459753 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" event={"ID":"669348d2-73cb-4efe-93b1-95f08fe54e82","Type":"ContainerStarted","Data":"15f907d69797145c2912c9427243b3b0370d4178f59078e3d8c5e9b8aa680ee5"} Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.460049 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 09:59:24.478031 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" podStartSLOduration=3.963123929 podStartE2EDuration="31.47800744s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.288332066 +0000 UTC m=+918.084963923" lastFinishedPulling="2025-10-03 09:59:23.803215577 +0000 UTC m=+945.599847434" observedRunningTime="2025-10-03 09:59:24.476881991 +0000 UTC m=+946.273513858" watchObservedRunningTime="2025-10-03 09:59:24.47800744 +0000 UTC m=+946.274639297" Oct 03 09:59:24 crc kubenswrapper[4990]: I1003 
09:59:24.491654 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-77swg" Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.303708 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.303807 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.303877 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.304839 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.304983 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1" gracePeriod=600 Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 
09:59:25.468613 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1" exitCode=0 Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.469396 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1"} Oct 03 09:59:25 crc kubenswrapper[4990]: I1003 09:59:25.469593 4990 scope.go:117] "RemoveContainer" containerID="1e2ad6cff8936a8295aabe81e306e21bfb74b8894d7097a04dd75c58a8d9b278" Oct 03 09:59:26 crc kubenswrapper[4990]: I1003 09:59:26.481265 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8"} Oct 03 09:59:28 crc kubenswrapper[4990]: I1003 09:59:28.497755 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" event={"ID":"0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e","Type":"ContainerStarted","Data":"6bf23954ed83ad0c60dac509c2f492b1661a430ea3962ffb9feffbbe086721be"} Oct 03 09:59:28 crc kubenswrapper[4990]: I1003 09:59:28.498690 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:59:28 crc kubenswrapper[4990]: I1003 09:59:28.516341 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" podStartSLOduration=4.425550672 podStartE2EDuration="35.516322249s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.279842766 
+0000 UTC m=+918.076474623" lastFinishedPulling="2025-10-03 09:59:27.370614343 +0000 UTC m=+949.167246200" observedRunningTime="2025-10-03 09:59:28.514265636 +0000 UTC m=+950.310897493" watchObservedRunningTime="2025-10-03 09:59:28.516322249 +0000 UTC m=+950.312954116" Oct 03 09:59:34 crc kubenswrapper[4990]: I1003 09:59:34.442042 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-6c9g6" Oct 03 09:59:34 crc kubenswrapper[4990]: I1003 09:59:34.965859 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-vhtvr" Oct 03 09:59:35 crc kubenswrapper[4990]: E1003 09:59:35.875000 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podUID="5e6f315d-894b-4bcc-89fa-4cfa08e9cf88" Oct 03 09:59:35 crc kubenswrapper[4990]: E1003 09:59:35.875051 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podUID="6683b324-a42c-419b-85a6-35386e0016bb" Oct 03 09:59:48 crc kubenswrapper[4990]: I1003 09:59:48.644271 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" 
event={"ID":"6683b324-a42c-419b-85a6-35386e0016bb","Type":"ContainerStarted","Data":"a7302c5bc074a588e8aa0dd0f862e532ac2e10780191386b5be91c842bac6eef"} Oct 03 09:59:48 crc kubenswrapper[4990]: I1003 09:59:48.645157 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 09:59:48 crc kubenswrapper[4990]: I1003 09:59:48.678967 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" podStartSLOduration=3.645163185 podStartE2EDuration="55.678943001s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.344463849 +0000 UTC m=+918.141095706" lastFinishedPulling="2025-10-03 09:59:48.378243665 +0000 UTC m=+970.174875522" observedRunningTime="2025-10-03 09:59:48.674220099 +0000 UTC m=+970.470851956" watchObservedRunningTime="2025-10-03 09:59:48.678943001 +0000 UTC m=+970.475574868" Oct 03 09:59:49 crc kubenswrapper[4990]: I1003 09:59:49.656736 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" event={"ID":"5e6f315d-894b-4bcc-89fa-4cfa08e9cf88","Type":"ContainerStarted","Data":"ca5212473f7fa45a71f25f9f14bbcdb65df1446404dc7c72aa264fe9791b8a82"} Oct 03 09:59:49 crc kubenswrapper[4990]: I1003 09:59:49.657416 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:59:49 crc kubenswrapper[4990]: I1003 09:59:49.682115 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" podStartSLOduration=3.737733093 podStartE2EDuration="56.682083206s" podCreationTimestamp="2025-10-03 09:58:53 +0000 UTC" firstStartedPulling="2025-10-03 09:58:56.385715378 +0000 UTC 
m=+918.182347225" lastFinishedPulling="2025-10-03 09:59:49.330065481 +0000 UTC m=+971.126697338" observedRunningTime="2025-10-03 09:59:49.681409539 +0000 UTC m=+971.478041446" watchObservedRunningTime="2025-10-03 09:59:49.682083206 +0000 UTC m=+971.478715103" Oct 03 09:59:54 crc kubenswrapper[4990]: I1003 09:59:54.342591 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-2j7gl" Oct 03 09:59:54 crc kubenswrapper[4990]: I1003 09:59:54.782312 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.161845 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m"] Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.163220 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.166145 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.166145 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.174027 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m"] Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.246485 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.247390 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.247605 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5vb\" (UniqueName: \"kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.349051 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.349117 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5vb\" (UniqueName: \"kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.349174 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.350298 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.356171 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.366282 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5vb\" (UniqueName: \"kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb\") pod \"collect-profiles-29324760-9hf2m\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.483473 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:00 crc kubenswrapper[4990]: I1003 10:00:00.914011 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m"] Oct 03 10:00:01 crc kubenswrapper[4990]: I1003 10:00:01.751967 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" event={"ID":"7270bda0-0176-495e-8111-623e25e97ec6","Type":"ContainerDied","Data":"86f066598dfbcc192921b0771831b7c09f2cd78f86c7fe9092a8a99ca20ac41b"} Oct 03 10:00:01 crc kubenswrapper[4990]: I1003 10:00:01.751777 4990 generic.go:334] "Generic (PLEG): container finished" podID="7270bda0-0176-495e-8111-623e25e97ec6" containerID="86f066598dfbcc192921b0771831b7c09f2cd78f86c7fe9092a8a99ca20ac41b" exitCode=0 Oct 03 10:00:01 crc kubenswrapper[4990]: I1003 10:00:01.752554 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" 
event={"ID":"7270bda0-0176-495e-8111-623e25e97ec6","Type":"ContainerStarted","Data":"8d37e557d071b77b2040913ccc3f4e3c4f6586ba18c656b860c01027a2c8452a"} Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.112881 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.298092 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume\") pod \"7270bda0-0176-495e-8111-623e25e97ec6\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.298193 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume\") pod \"7270bda0-0176-495e-8111-623e25e97ec6\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.298244 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5vb\" (UniqueName: \"kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb\") pod \"7270bda0-0176-495e-8111-623e25e97ec6\" (UID: \"7270bda0-0176-495e-8111-623e25e97ec6\") " Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.299166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume" (OuterVolumeSpecName: "config-volume") pod "7270bda0-0176-495e-8111-623e25e97ec6" (UID: "7270bda0-0176-495e-8111-623e25e97ec6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.305119 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb" (OuterVolumeSpecName: "kube-api-access-6p5vb") pod "7270bda0-0176-495e-8111-623e25e97ec6" (UID: "7270bda0-0176-495e-8111-623e25e97ec6"). InnerVolumeSpecName "kube-api-access-6p5vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.307448 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7270bda0-0176-495e-8111-623e25e97ec6" (UID: "7270bda0-0176-495e-8111-623e25e97ec6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.400314 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7270bda0-0176-495e-8111-623e25e97ec6-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.400785 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7270bda0-0176-495e-8111-623e25e97ec6-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.400860 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5vb\" (UniqueName: \"kubernetes.io/projected/7270bda0-0176-495e-8111-623e25e97ec6-kube-api-access-6p5vb\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.769611 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" 
event={"ID":"7270bda0-0176-495e-8111-623e25e97ec6","Type":"ContainerDied","Data":"8d37e557d071b77b2040913ccc3f4e3c4f6586ba18c656b860c01027a2c8452a"} Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.769658 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d37e557d071b77b2040913ccc3f4e3c4f6586ba18c656b860c01027a2c8452a" Oct 03 10:00:03 crc kubenswrapper[4990]: I1003 10:00:03.769678 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.597561 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:10 crc kubenswrapper[4990]: E1003 10:00:10.598724 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7270bda0-0176-495e-8111-623e25e97ec6" containerName="collect-profiles" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.598741 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7270bda0-0176-495e-8111-623e25e97ec6" containerName="collect-profiles" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.598958 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7270bda0-0176-495e-8111-623e25e97ec6" containerName="collect-profiles" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.599947 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.602434 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7558f" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.602868 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.603942 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.604333 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.610591 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.625329 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.625419 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r978m\" (UniqueName: \"kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.726863 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: 
\"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.726969 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r978m\" (UniqueName: \"kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.728212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.730705 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.732156 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.735526 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.745417 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.765827 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r978m\" (UniqueName: \"kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m\") pod \"dnsmasq-dns-6d84845cb9-47rpd\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.828193 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.828259 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22qd\" (UniqueName: \"kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.828353 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 
10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.925778 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.929297 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22qd\" (UniqueName: \"kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.929356 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.929426 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.930208 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.930329 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: 
\"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:10 crc kubenswrapper[4990]: I1003 10:00:10.949392 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22qd\" (UniqueName: \"kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd\") pod \"dnsmasq-dns-8687b65d7f-2f2lx\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.051003 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.424892 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.433672 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.490210 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:11 crc kubenswrapper[4990]: W1003 10:00:11.503399 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1d2e15f_1a31_40b1_ac27_448ee3d34723.slice/crio-827ac66e9b9a1bda4f9c98e78a602f52b53a1c97cf21a4f49642813aedc8d651 WatchSource:0}: Error finding container 827ac66e9b9a1bda4f9c98e78a602f52b53a1c97cf21a4f49642813aedc8d651: Status 404 returned error can't find the container with id 827ac66e9b9a1bda4f9c98e78a602f52b53a1c97cf21a4f49642813aedc8d651 Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.847401 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" 
event={"ID":"b1d2e15f-1a31-40b1-ac27-448ee3d34723","Type":"ContainerStarted","Data":"827ac66e9b9a1bda4f9c98e78a602f52b53a1c97cf21a4f49642813aedc8d651"} Oct 03 10:00:11 crc kubenswrapper[4990]: I1003 10:00:11.848774 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" event={"ID":"e1890dca-166e-42ff-8033-9a54ab2d4536","Type":"ContainerStarted","Data":"ae3c644cbe76cd52c8cd2b0f2a436893401858a559d17fc64a832feef4adb2bb"} Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.365170 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.397789 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.398999 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.409652 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.464870 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.464920 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.464942 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpjx\" (UniqueName: \"kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.566958 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.567008 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.567043 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpjx\" (UniqueName: \"kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.568255 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.568862 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.634869 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwpjx\" (UniqueName: \"kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx\") pod \"dnsmasq-dns-7b749bd587-w9mb4\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.713862 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.724672 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.760364 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.789105 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.789258 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.875584 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77js\" (UniqueName: \"kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.875644 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.875695 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.976833 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77js\" (UniqueName: \"kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.976966 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: 
\"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.977099 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.977959 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:13 crc kubenswrapper[4990]: I1003 10:00:13.978647 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.022291 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77js\" (UniqueName: \"kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js\") pod \"dnsmasq-dns-5cb7995759-sbw4c\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.131595 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.431453 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:14 crc kubenswrapper[4990]: W1003 10:00:14.446336 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ef01a2_2d06_4dd4_ae0e_53eac15d8500.slice/crio-ee476af2171e6aaf7a2b4234004143fb1b18438ce78b5efeeb255cd6b6a49505 WatchSource:0}: Error finding container ee476af2171e6aaf7a2b4234004143fb1b18438ce78b5efeeb255cd6b6a49505: Status 404 returned error can't find the container with id ee476af2171e6aaf7a2b4234004143fb1b18438ce78b5efeeb255cd6b6a49505 Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.559886 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.564221 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572124 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572271 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6vqvb" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572337 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572485 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572643 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.572751 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.573273 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.573585 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.584824 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.584888 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.584930 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.584960 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wdc\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.584987 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585016 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585051 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585093 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585243 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585401 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.585557 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.682431 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687031 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687289 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687335 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687722 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687873 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.687918 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.688072 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.688296 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wdc\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.688334 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.688550 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.688588 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 
10:00:14.689557 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.690970 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.692490 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.692878 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.693091 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.693341 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.696938 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: W1003 10:00:14.699128 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb03a21a_6575_489a_a9c8_cb6035363f12.slice/crio-8c6613d18696259d3fd1829af0e66ab63a348732d43e546bd1e445bdc63a8dcc WatchSource:0}: Error finding container 8c6613d18696259d3fd1829af0e66ab63a348732d43e546bd1e445bdc63a8dcc: Status 404 returned error can't find the container with id 8c6613d18696259d3fd1829af0e66ab63a348732d43e546bd1e445bdc63a8dcc Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.703433 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.705398 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.708171 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.712555 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wdc\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.723693 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") " pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.887309 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.889346 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.889483 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.890003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" event={"ID":"86ef01a2-2d06-4dd4-ae0e-53eac15d8500","Type":"ContainerStarted","Data":"ee476af2171e6aaf7a2b4234004143fb1b18438ce78b5efeeb255cd6b6a49505"} Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.891850 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6nl7j" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.891994 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.892136 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.892145 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.892284 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.892341 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.892993 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.894085 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" event={"ID":"db03a21a-6575-489a-a9c8-cb6035363f12","Type":"ContainerStarted","Data":"8c6613d18696259d3fd1829af0e66ab63a348732d43e546bd1e445bdc63a8dcc"} Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.910529 4990 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993689 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993769 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993795 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993822 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993857 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993942 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.993977 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdd4\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.994077 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.994096 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.994199 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:14 crc kubenswrapper[4990]: I1003 10:00:14.994224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.098706 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099158 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099178 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099200 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099220 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099254 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099281 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdd4\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099325 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099360 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.099385 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.100964 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.102150 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.100965 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.103270 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.103584 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.106926 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.108278 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.116979 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.117439 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.124071 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.131895 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdd4\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.152399 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.212640 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.398927 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:00:15 crc kubenswrapper[4990]: W1003 10:00:15.434830 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51461d28_e850_4ba3_8f27_0252b51903f1.slice/crio-f8342d53d9a73416382f9c581e30435c715537b9887674feb92ac0b3b7083a77 WatchSource:0}: Error finding container f8342d53d9a73416382f9c581e30435c715537b9887674feb92ac0b3b7083a77: Status 404 returned error can't find the container with id f8342d53d9a73416382f9c581e30435c715537b9887674feb92ac0b3b7083a77 Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.727425 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:00:15 crc kubenswrapper[4990]: I1003 10:00:15.913166 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerStarted","Data":"f8342d53d9a73416382f9c581e30435c715537b9887674feb92ac0b3b7083a77"} Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.355591 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.363281 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.374566 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7qtmh" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.374905 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.375063 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.375201 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.375941 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.394942 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.412622 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530424 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskbm\" (UniqueName: \"kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530491 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " 
pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530550 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530591 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530632 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530661 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530717 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: 
I1003 10:00:16.530758 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.530783 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.631710 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632217 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632260 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632289 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632341 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632379 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632401 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskbm\" (UniqueName: \"kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632487 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.632898 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.634214 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.634895 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.635770 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.637211 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.644855 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.645665 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.658137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.669039 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskbm\" (UniqueName: \"kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.689570 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") " pod="openstack/openstack-galera-0" Oct 03 10:00:16 crc kubenswrapper[4990]: I1003 10:00:16.702384 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.385952 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.387400 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.389835 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.389855 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.390049 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vdhjf" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.393965 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.400040 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549710 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549753 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549799 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549868 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549945 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwpj\" (UniqueName: \"kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549969 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets\") pod 
\"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.549991 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.550015 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652466 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwpj\" (UniqueName: \"kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652547 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652575 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " 
pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652638 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652660 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652694 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.652719 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc 
kubenswrapper[4990]: I1003 10:00:17.652751 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.657729 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.659324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.659719 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.660261 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.660791 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.662501 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.662822 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.673370 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.681809 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwpj\" (UniqueName: \"kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj\") pod \"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.706650 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.712822 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.875543 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.877351 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.879652 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-59jhh" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.879919 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.880201 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.887528 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.957451 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrrk\" (UniqueName: \"kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.957869 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " 
pod="openstack/memcached-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.958404 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.958487 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:17 crc kubenswrapper[4990]: I1003 10:00:17.958613 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.060378 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.061553 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.061716 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hcrrk\" (UniqueName: \"kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.061810 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.062358 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.062689 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.063309 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.075158 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 
10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.075812 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.094602 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrrk\" (UniqueName: \"kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk\") pod \"memcached-0\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " pod="openstack/memcached-0" Oct 03 10:00:18 crc kubenswrapper[4990]: I1003 10:00:18.199702 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.636173 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.638180 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.641116 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tjn5l" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.650741 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.795905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msrkd\" (UniqueName: \"kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd\") pod \"kube-state-metrics-0\" (UID: \"78a0dde5-4d0a-49a7-b9b1-081a994a41da\") " pod="openstack/kube-state-metrics-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.897719 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msrkd\" (UniqueName: \"kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd\") pod \"kube-state-metrics-0\" (UID: \"78a0dde5-4d0a-49a7-b9b1-081a994a41da\") " pod="openstack/kube-state-metrics-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.928463 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msrkd\" (UniqueName: \"kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd\") pod \"kube-state-metrics-0\" (UID: \"78a0dde5-4d0a-49a7-b9b1-081a994a41da\") " pod="openstack/kube-state-metrics-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.966466 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:00:19 crc kubenswrapper[4990]: I1003 10:00:19.980927 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerStarted","Data":"1ce8535c7e0fcadee3e8f2222a51e30a7d30339d8968e3832a95ef56d642fb00"} Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.399500 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfzkg"] Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.400843 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.404753 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.404934 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.405043 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n6zh4" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.414419 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg"] Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.437643 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"] Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.439955 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.469485 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"] Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.481638 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.481923 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.482087 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.482247 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.482419 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.482559 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gr5p\" (UniqueName: \"kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.482695 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585170 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585232 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585263 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle\") pod \"ovn-controller-nfzkg\" 
(UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585302 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585341 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585366 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585403 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585438 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " 
pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585477 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gr5p\" (UniqueName: \"kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585571 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585602 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.585640 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t82d2\" (UniqueName: \"kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: 
I1003 10:00:24.588022 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.588064 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.588169 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.594269 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.596815 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.597387 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.602836 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gr5p\" (UniqueName: \"kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p\") pod \"ovn-controller-nfzkg\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687200 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687271 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687302 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687342 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts\") pod \"ovn-controller-ovs-zxxk7\" (UID: 
\"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687358 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t82d2\" (UniqueName: \"kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687407 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687414 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687537 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.687800 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.689460 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.714259 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t82d2\" (UniqueName: \"kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2\") pod \"ovn-controller-ovs-zxxk7\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.747413 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:24 crc kubenswrapper[4990]: I1003 10:00:24.758609 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.312059 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.313775 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.317221 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ntcgb" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.320118 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.320479 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.320726 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.320996 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.358449 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.398904 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.398998 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zz2px\" (UniqueName: \"kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399088 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399105 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399168 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399190 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.399264 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501312 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501361 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501412 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501477 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc 
kubenswrapper[4990]: I1003 10:00:25.501504 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz2px\" (UniqueName: \"kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501535 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.501556 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.504781 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.504945 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.504970 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.505343 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.505559 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.505941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.515788 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.526288 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz2px\" (UniqueName: \"kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 
10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.533445 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:25 crc kubenswrapper[4990]: I1003 10:00:25.647197 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.609303 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.611605 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.614495 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.614705 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.615647 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9sh88" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.616222 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.623155 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.722915 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " 
pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.722982 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.723111 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.723144 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.723160 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.724184 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 
10:00:26.724256 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x47v\" (UniqueName: \"kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.724287 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.825793 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.826642 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.826801 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.826967 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.827088 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.827130 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.827167 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.827248 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x47v\" (UniqueName: \"kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.827290 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.828675 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.830311 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.830357 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.831880 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.832815 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.838863 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.843701 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x47v\" (UniqueName: \"kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.850693 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:26 crc kubenswrapper[4990]: I1003 10:00:26.941540 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.281777 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.282394 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q22qd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readines
sProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8687b65d7f-2f2lx_openstack(b1d2e15f-1a31-40b1-ac27-448ee3d34723): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.283557 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" podUID="b1d2e15f-1a31-40b1-ac27-448ee3d34723" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.288734 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.288917 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwpjx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7b749bd587-w9mb4_openstack(86ef01a2-2d06-4dd4-ae0e-53eac15d8500): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.290003 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.311045 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.311345 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k77js,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cb7995759-sbw4c_openstack(db03a21a-6575-489a-a9c8-cb6035363f12): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.312707 4990 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.329201 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.329441 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r978m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d84845cb9-47rpd_openstack(e1890dca-166e-42ff-8033-9a54ab2d4536): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 10:00:33 crc kubenswrapper[4990]: E1003 10:00:33.330862 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" podUID="e1890dca-166e-42ff-8033-9a54ab2d4536" Oct 03 10:00:33 crc kubenswrapper[4990]: I1003 10:00:33.707286 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 10:00:34 crc kubenswrapper[4990]: E1003 10:00:34.098697 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5\\\"\"" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" Oct 03 10:00:34 crc kubenswrapper[4990]: E1003 10:00:34.098850 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5\\\"\"" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" Oct 03 10:00:34 crc kubenswrapper[4990]: W1003 10:00:34.502490 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod230b4581_35e6_4c97_9f63_73e70624bf5c.slice/crio-795218cff2f7ef7fe86b2383573d93567b8f0f30d0236156bd11d2b7f2240923 WatchSource:0}: Error finding container 795218cff2f7ef7fe86b2383573d93567b8f0f30d0236156bd11d2b7f2240923: Status 404 returned error can't find the container with id 795218cff2f7ef7fe86b2383573d93567b8f0f30d0236156bd11d2b7f2240923 Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.673760 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.681991 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788127 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config\") pod \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788278 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r978m\" (UniqueName: \"kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m\") pod \"e1890dca-166e-42ff-8033-9a54ab2d4536\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788303 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q22qd\" (UniqueName: \"kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd\") pod \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788340 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc\") pod \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\" (UID: \"b1d2e15f-1a31-40b1-ac27-448ee3d34723\") " Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788415 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config\") pod \"e1890dca-166e-42ff-8033-9a54ab2d4536\" (UID: \"e1890dca-166e-42ff-8033-9a54ab2d4536\") " Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788632 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config" (OuterVolumeSpecName: "config") pod "b1d2e15f-1a31-40b1-ac27-448ee3d34723" (UID: "b1d2e15f-1a31-40b1-ac27-448ee3d34723"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.788887 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.789064 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1d2e15f-1a31-40b1-ac27-448ee3d34723" (UID: "b1d2e15f-1a31-40b1-ac27-448ee3d34723"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.789252 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config" (OuterVolumeSpecName: "config") pod "e1890dca-166e-42ff-8033-9a54ab2d4536" (UID: "e1890dca-166e-42ff-8033-9a54ab2d4536"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.792536 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m" (OuterVolumeSpecName: "kube-api-access-r978m") pod "e1890dca-166e-42ff-8033-9a54ab2d4536" (UID: "e1890dca-166e-42ff-8033-9a54ab2d4536"). InnerVolumeSpecName "kube-api-access-r978m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.793155 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd" (OuterVolumeSpecName: "kube-api-access-q22qd") pod "b1d2e15f-1a31-40b1-ac27-448ee3d34723" (UID: "b1d2e15f-1a31-40b1-ac27-448ee3d34723"). InnerVolumeSpecName "kube-api-access-q22qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.890052 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1890dca-166e-42ff-8033-9a54ab2d4536-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.890301 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r978m\" (UniqueName: \"kubernetes.io/projected/e1890dca-166e-42ff-8033-9a54ab2d4536-kube-api-access-r978m\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.890316 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q22qd\" (UniqueName: \"kubernetes.io/projected/b1d2e15f-1a31-40b1-ac27-448ee3d34723-kube-api-access-q22qd\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:34 crc kubenswrapper[4990]: I1003 10:00:34.890326 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1d2e15f-1a31-40b1-ac27-448ee3d34723-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:34.999973 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.015489 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:00:35 crc kubenswrapper[4990]: W1003 10:00:35.023126 4990 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe31a60_7e5f_40a8_acf3_d7a17c210e74.slice/crio-21b2eb22da5beaa5673b813cddc311554ce933b5161e40ba8798952257c54e8d WatchSource:0}: Error finding container 21b2eb22da5beaa5673b813cddc311554ce933b5161e40ba8798952257c54e8d: Status 404 returned error can't find the container with id 21b2eb22da5beaa5673b813cddc311554ce933b5161e40ba8798952257c54e8d Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.121406 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerStarted","Data":"6714afbbfaa0582c607dde4a23d8cd684e969ee6fbc2cfc7c5e1f572624c145d"} Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.124292 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.124456 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-2f2lx" event={"ID":"b1d2e15f-1a31-40b1-ac27-448ee3d34723","Type":"ContainerDied","Data":"827ac66e9b9a1bda4f9c98e78a602f52b53a1c97cf21a4f49642813aedc8d651"} Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.126279 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerStarted","Data":"21b2eb22da5beaa5673b813cddc311554ce933b5161e40ba8798952257c54e8d"} Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.132118 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" event={"ID":"e1890dca-166e-42ff-8033-9a54ab2d4536","Type":"ContainerDied","Data":"ae3c644cbe76cd52c8cd2b0f2a436893401858a559d17fc64a832feef4adb2bb"} Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.133446 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-47rpd" Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.136952 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"230b4581-35e6-4c97-9f63-73e70624bf5c","Type":"ContainerStarted","Data":"795218cff2f7ef7fe86b2383573d93567b8f0f30d0236156bd11d2b7f2240923"} Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.154906 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.175578 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.191542 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.197861 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-2f2lx"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.217852 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.225766 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-47rpd"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.247561 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 10:00:35 crc kubenswrapper[4990]: I1003 10:00:35.450648 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"] Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.056694 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.144686 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg" 
event={"ID":"a6d51bd5-1a8f-402d-80e1-441872e15719","Type":"ContainerStarted","Data":"d7222243b011d34e7019fc28eec4f33e7fd3717826982e1e441a5496ec8778e8"} Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.145932 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"78a0dde5-4d0a-49a7-b9b1-081a994a41da","Type":"ContainerStarted","Data":"1d9dd5c39ce872d5fb5a138548b1524aa4b99ee45492e2756bfb4822f56237e5"} Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.147652 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerStarted","Data":"46a2006f531297eb507c3d080522e05f935cfe53f8d27382af0ef0806a9315a1"} Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.150923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerStarted","Data":"e861d179a141cd79e03a597643e1d9fe8963fcddf557ddb8cf8f6c8c663a5516"} Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.152720 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerStarted","Data":"52be642a2c45cc7f0625fa92fcdfe338f57666d0243ebafce92d0592d6b6a529"} Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.154466 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerStarted","Data":"b4d7bb564bea84a147b666614fdc109dbffb801e15bd785bda000207cddae019"} Oct 03 10:00:36 crc kubenswrapper[4990]: W1003 10:00:36.534771 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaaf61b7_7fcf_40c6_91c4_56ed9720cdf9.slice/crio-8a0ede082177925400825a20468a7f0674e16bf187f695f71d978a0215d1e517 
WatchSource:0}: Error finding container 8a0ede082177925400825a20468a7f0674e16bf187f695f71d978a0215d1e517: Status 404 returned error can't find the container with id 8a0ede082177925400825a20468a7f0674e16bf187f695f71d978a0215d1e517 Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.887950 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d2e15f-1a31-40b1-ac27-448ee3d34723" path="/var/lib/kubelet/pods/b1d2e15f-1a31-40b1-ac27-448ee3d34723/volumes" Oct 03 10:00:36 crc kubenswrapper[4990]: I1003 10:00:36.889593 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1890dca-166e-42ff-8033-9a54ab2d4536" path="/var/lib/kubelet/pods/e1890dca-166e-42ff-8033-9a54ab2d4536/volumes" Oct 03 10:00:37 crc kubenswrapper[4990]: I1003 10:00:37.163741 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerStarted","Data":"8a0ede082177925400825a20468a7f0674e16bf187f695f71d978a0215d1e517"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.208277 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerStarted","Data":"2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.211668 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg" event={"ID":"a6d51bd5-1a8f-402d-80e1-441872e15719","Type":"ContainerStarted","Data":"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.211883 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nfzkg" Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.213802 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerStarted","Data":"6304bb47401c2e416b699ae72b910fc4eb114553fd6d296ac55983e6561263e9"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.217089 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerStarted","Data":"8d47e9542f5010045e9f1e66e2c4ba83ad379324fa4cf15c59ee6b012997be8d"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.218611 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"230b4581-35e6-4c97-9f63-73e70624bf5c","Type":"ContainerStarted","Data":"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.218821 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.220636 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"78a0dde5-4d0a-49a7-b9b1-081a994a41da","Type":"ContainerStarted","Data":"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.221466 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.222747 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerStarted","Data":"7458a9d1d574a85e34d93996629001a19ee6d5414dafa8bf5462a2fecc3238db"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.225085 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerStarted","Data":"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2"} Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.232933 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nfzkg" podStartSLOduration=11.702055874 podStartE2EDuration="19.232915924s" podCreationTimestamp="2025-10-03 10:00:24 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.157520429 +0000 UTC m=+1016.954152276" lastFinishedPulling="2025-10-03 10:00:42.688380479 +0000 UTC m=+1024.485012326" observedRunningTime="2025-10-03 10:00:43.227991773 +0000 UTC m=+1025.024623640" watchObservedRunningTime="2025-10-03 10:00:43.232915924 +0000 UTC m=+1025.029547781" Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.283227 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.373052753 podStartE2EDuration="26.283206647s" podCreationTimestamp="2025-10-03 10:00:17 +0000 UTC" firstStartedPulling="2025-10-03 10:00:34.527078947 +0000 UTC m=+1016.323710804" lastFinishedPulling="2025-10-03 10:00:37.437232841 +0000 UTC m=+1019.233864698" observedRunningTime="2025-10-03 10:00:43.261149276 +0000 UTC m=+1025.057781133" watchObservedRunningTime="2025-10-03 10:00:43.283206647 +0000 UTC m=+1025.079838504" Oct 03 10:00:43 crc kubenswrapper[4990]: I1003 10:00:43.320887 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.743309336 podStartE2EDuration="24.320867861s" podCreationTimestamp="2025-10-03 10:00:19 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.137532939 +0000 UTC m=+1016.934164796" lastFinishedPulling="2025-10-03 10:00:42.715091464 +0000 UTC m=+1024.511723321" observedRunningTime="2025-10-03 10:00:43.305882933 +0000 UTC m=+1025.102514780" watchObservedRunningTime="2025-10-03 10:00:43.320867861 +0000 UTC 
m=+1025.117499718" Oct 03 10:00:44 crc kubenswrapper[4990]: I1003 10:00:44.239699 4990 generic.go:334] "Generic (PLEG): container finished" podID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerID="7458a9d1d574a85e34d93996629001a19ee6d5414dafa8bf5462a2fecc3238db" exitCode=0 Oct 03 10:00:44 crc kubenswrapper[4990]: I1003 10:00:44.240035 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerDied","Data":"7458a9d1d574a85e34d93996629001a19ee6d5414dafa8bf5462a2fecc3238db"} Oct 03 10:00:45 crc kubenswrapper[4990]: I1003 10:00:45.250870 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerStarted","Data":"554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.270075 4990 generic.go:334] "Generic (PLEG): container finished" podID="db03a21a-6575-489a-a9c8-cb6035363f12" containerID="5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a" exitCode=0 Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.270169 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" event={"ID":"db03a21a-6575-489a-a9c8-cb6035363f12","Type":"ContainerDied","Data":"5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.274995 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerStarted","Data":"7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.275228 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 
10:00:47.275270 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.279003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerStarted","Data":"78d0f733b310c1f6130bc24b6979ef796f0479532e62ab932da054c013922271"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.280822 4990 generic.go:334] "Generic (PLEG): container finished" podID="16a22247-2803-4910-a44a-9ccba673c2cf" containerID="78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2" exitCode=0 Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.280929 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerDied","Data":"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.293572 4990 generic.go:334] "Generic (PLEG): container finished" podID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerID="6304bb47401c2e416b699ae72b910fc4eb114553fd6d296ac55983e6561263e9" exitCode=0 Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.293614 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerDied","Data":"6304bb47401c2e416b699ae72b910fc4eb114553fd6d296ac55983e6561263e9"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.296238 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerStarted","Data":"98c8679f7b257dccb8f8856102144602d80294d3dcc4917701e8537d48ac3f47"} Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.361271 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.791711018 podStartE2EDuration="22.361247234s" podCreationTimestamp="2025-10-03 10:00:25 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.486679212 +0000 UTC m=+1017.283311069" lastFinishedPulling="2025-10-03 10:00:46.056215438 +0000 UTC m=+1027.852847285" observedRunningTime="2025-10-03 10:00:47.346215415 +0000 UTC m=+1029.142847272" watchObservedRunningTime="2025-10-03 10:00:47.361247234 +0000 UTC m=+1029.157879091" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.403381 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.914863584 podStartE2EDuration="23.403361117s" podCreationTimestamp="2025-10-03 10:00:24 +0000 UTC" firstStartedPulling="2025-10-03 10:00:36.536580691 +0000 UTC m=+1018.333212548" lastFinishedPulling="2025-10-03 10:00:46.025078224 +0000 UTC m=+1027.821710081" observedRunningTime="2025-10-03 10:00:47.386865662 +0000 UTC m=+1029.183497529" watchObservedRunningTime="2025-10-03 10:00:47.403361117 +0000 UTC m=+1029.199992974" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.404050 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zxxk7" podStartSLOduration=16.73292147 podStartE2EDuration="23.404043904s" podCreationTimestamp="2025-10-03 10:00:24 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.485548855 +0000 UTC m=+1017.282180712" lastFinishedPulling="2025-10-03 10:00:42.156671289 +0000 UTC m=+1023.953303146" observedRunningTime="2025-10-03 10:00:47.369770823 +0000 UTC m=+1029.166402700" watchObservedRunningTime="2025-10-03 10:00:47.404043904 +0000 UTC m=+1029.200675761" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.942185 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:47 crc kubenswrapper[4990]: I1003 10:00:47.994467 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.201133 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.305345 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" event={"ID":"db03a21a-6575-489a-a9c8-cb6035363f12","Type":"ContainerStarted","Data":"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c"} Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.306308 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.309557 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerStarted","Data":"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22"} Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.312813 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerStarted","Data":"72084d24cc256a164d380470d3a517d6d49179f56ce62666893f00f00964d3bf"} Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.313874 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.324692 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" podStartSLOduration=3.660471705 podStartE2EDuration="35.324672213s" podCreationTimestamp="2025-10-03 10:00:13 +0000 UTC" firstStartedPulling="2025-10-03 10:00:14.701171392 +0000 UTC m=+996.497803249" lastFinishedPulling="2025-10-03 10:00:46.3653719 +0000 UTC m=+1028.162003757" observedRunningTime="2025-10-03 
10:00:48.323904794 +0000 UTC m=+1030.120536671" watchObservedRunningTime="2025-10-03 10:00:48.324672213 +0000 UTC m=+1030.121304080" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.343798 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.719909421 podStartE2EDuration="32.343779992s" podCreationTimestamp="2025-10-03 10:00:16 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.017741581 +0000 UTC m=+1016.814373438" lastFinishedPulling="2025-10-03 10:00:42.641612152 +0000 UTC m=+1024.438244009" observedRunningTime="2025-10-03 10:00:48.340263525 +0000 UTC m=+1030.136895392" watchObservedRunningTime="2025-10-03 10:00:48.343779992 +0000 UTC m=+1030.140411849" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.354602 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.365177 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.233287991 podStartE2EDuration="33.365163906s" podCreationTimestamp="2025-10-03 10:00:15 +0000 UTC" firstStartedPulling="2025-10-03 10:00:35.02464254 +0000 UTC m=+1016.821274407" lastFinishedPulling="2025-10-03 10:00:42.156518465 +0000 UTC m=+1023.953150322" observedRunningTime="2025-10-03 10:00:48.363930296 +0000 UTC m=+1030.160562153" watchObservedRunningTime="2025-10-03 10:00:48.365163906 +0000 UTC m=+1030.161795763" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.675885 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.699851 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.701694 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.709602 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.720707 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.738012 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-68zd7"] Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.741690 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.743789 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.762884 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-68zd7"] Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872195 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872496 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbcmn\" (UniqueName: \"kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872607 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872701 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872818 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55jhr\" (UniqueName: \"kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.872911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.873006 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.873091 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.873172 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.873270 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.975313 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976353 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55jhr\" (UniqueName: \"kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 
10:00:48.976479 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976574 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976622 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976674 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976763 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.976914 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.977007 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbcmn\" (UniqueName: \"kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.977078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.977364 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.978066 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.978580 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.978672 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.979761 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.983258 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:48 crc kubenswrapper[4990]: I1003 10:00:48.983580 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.000275 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55jhr\" (UniqueName: \"kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr\") pod \"ovn-controller-metrics-68zd7\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.000505 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbcmn\" (UniqueName: \"kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn\") pod \"dnsmasq-dns-6dc9bfff7c-2dtbh\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.038270 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.064621 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.066951 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.117362 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.122640 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.130328 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.136163 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.281388 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.281759 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xkwp\" (UniqueName: \"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.282015 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.282074 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " 
pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.282337 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.383355 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xkwp\" (UniqueName: \"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.383426 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.383458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.383568 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc 
kubenswrapper[4990]: I1003 10:00:49.383658 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.384612 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.385455 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.386196 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.387866 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.407756 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4xkwp\" (UniqueName: \"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp\") pod \"dnsmasq-dns-75898fdcf9-lr2jj\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.573925 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.647919 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.670215 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.708099 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.736710 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-68zd7"] Oct 03 10:00:49 crc kubenswrapper[4990]: I1003 10:00:49.979726 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.006330 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.066597 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.086646 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.088924 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.130649 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.238652 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.239127 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.239203 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.239258 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbsc\" (UniqueName: \"kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.239379 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.336147 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerStarted","Data":"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.336233 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerStarted","Data":"9354619f37e7ad4d5752c1ac636656c4d5a3bab86b321519176a1dba70c85342"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.338733 4990 generic.go:334] "Generic (PLEG): container finished" podID="dfd0ab25-3469-4914-a63a-e38533850c0f" containerID="89b6ed3ecf90b015c7777f57abfa512b2589fe07b30b0c80f5f648cc07d56c95" exitCode=0 Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.338855 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" event={"ID":"dfd0ab25-3469-4914-a63a-e38533850c0f","Type":"ContainerDied","Data":"89b6ed3ecf90b015c7777f57abfa512b2589fe07b30b0c80f5f648cc07d56c95"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.338908 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" event={"ID":"dfd0ab25-3469-4914-a63a-e38533850c0f","Type":"ContainerStarted","Data":"b3c7bef025844f505516a448f3edd02578111b994b29cb1b6a38cb947e6eeec3"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.340320 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbsc\" (UniqueName: 
\"kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.340359 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.340450 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.340470 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.340538 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.341772 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config\") pod 
\"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.342178 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.342443 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.343049 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.369328 4990 generic.go:334] "Generic (PLEG): container finished" podID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" containerID="87e847712ce89f6b0b8967bd83de7f499a15c4e274b6828a73d1f47baa1d42c5" exitCode=0 Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.369587 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" event={"ID":"86ef01a2-2d06-4dd4-ae0e-53eac15d8500","Type":"ContainerDied","Data":"87e847712ce89f6b0b8967bd83de7f499a15c4e274b6828a73d1f47baa1d42c5"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.371785 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbsc\" (UniqueName: 
\"kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc\") pod \"dnsmasq-dns-69d4d5cdc5-4t8d6\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.441163 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="dnsmasq-dns" containerID="cri-o://720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c" gracePeriod=10 Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.442886 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68zd7" event={"ID":"463798bd-8799-4206-bf0c-b2f62f1fc1d0","Type":"ContainerStarted","Data":"2f089b7e26c6c70346708ab3abe1a9903c0d6d3655a4a9350c20b8a22252b418"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.442930 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68zd7" event={"ID":"463798bd-8799-4206-bf0c-b2f62f1fc1d0","Type":"ContainerStarted","Data":"a75c9c338529a422f529a516561d3b80f75053c6e8fca4729272fbe4fc46be90"} Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.443958 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.489349 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.557289 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-68zd7" podStartSLOduration=2.557270138 podStartE2EDuration="2.557270138s" podCreationTimestamp="2025-10-03 10:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:00:50.522122836 +0000 UTC m=+1032.318754693" watchObservedRunningTime="2025-10-03 10:00:50.557270138 +0000 UTC m=+1032.353901995" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.698110 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 10:00:50 crc kubenswrapper[4990]: E1003 10:00:50.834162 4990 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 10:00:50 crc kubenswrapper[4990]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/dfd0ab25-3469-4914-a63a-e38533850c0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 10:00:50 crc kubenswrapper[4990]: > podSandboxID="b3c7bef025844f505516a448f3edd02578111b994b29cb1b6a38cb947e6eeec3" Oct 03 10:00:50 crc kubenswrapper[4990]: E1003 10:00:50.834749 4990 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 10:00:50 crc kubenswrapper[4990]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch647h5fdh676h5c8h566h96h5d8hdh569h64dh5b5h587h55h5cch58dh658h67h5f6h64fh648h6h59fh65ch7hf9hf6h74hf8hch596h5b8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dbcmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6dc9bfff7c-2dtbh_openstack(dfd0ab25-3469-4914-a63a-e38533850c0f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/dfd0ab25-3469-4914-a63a-e38533850c0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 10:00:50 crc kubenswrapper[4990]: > logger="UnhandledError" Oct 03 10:00:50 crc kubenswrapper[4990]: E1003 10:00:50.835893 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/dfd0ab25-3469-4914-a63a-e38533850c0f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" podUID="dfd0ab25-3469-4914-a63a-e38533850c0f" Oct 03 10:00:50 crc kubenswrapper[4990]: E1003 10:00:50.852671 4990 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:54362->38.102.83.146:37319: write tcp 38.102.83.146:54362->38.102.83.146:37319: write: broken pipe Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.963974 4990 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.965758 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.968222 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.968485 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.969924 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-w42nf" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.970088 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 10:00:50 crc kubenswrapper[4990]: I1003 10:00:50.985057 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.030377 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.101405 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.157724 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.158332 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="init" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.158398 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="init" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.158452 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" containerName="init" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.158556 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" containerName="init" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.158628 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="dnsmasq-dns" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.158690 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="dnsmasq-dns" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.158911 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" containerName="dnsmasq-dns" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.159041 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" containerName="init" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.167879 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc\") pod 
\"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168030 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config\") pod \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168215 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwpjx\" (UniqueName: \"kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx\") pod \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\" (UID: \"86ef01a2-2d06-4dd4-ae0e-53eac15d8500\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168498 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168600 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168649 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168689 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168742 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fm8\" (UniqueName: \"kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168782 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.168854 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.172258 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.173723 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.181399 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.182211 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.183879 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xvm86" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.186913 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx" (OuterVolumeSpecName: "kube-api-access-rwpjx") pod "86ef01a2-2d06-4dd4-ae0e-53eac15d8500" (UID: "86ef01a2-2d06-4dd4-ae0e-53eac15d8500"). InnerVolumeSpecName "kube-api-access-rwpjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.188901 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.194835 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config" (OuterVolumeSpecName: "config") pod "86ef01a2-2d06-4dd4-ae0e-53eac15d8500" (UID: "86ef01a2-2d06-4dd4-ae0e-53eac15d8500"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.209723 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86ef01a2-2d06-4dd4-ae0e-53eac15d8500" (UID: "86ef01a2-2d06-4dd4-ae0e-53eac15d8500"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc\") pod \"db03a21a-6575-489a-a9c8-cb6035363f12\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270402 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k77js\" (UniqueName: \"kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js\") pod \"db03a21a-6575-489a-a9c8-cb6035363f12\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270449 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config\") pod \"db03a21a-6575-489a-a9c8-cb6035363f12\" (UID: \"db03a21a-6575-489a-a9c8-cb6035363f12\") " Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270651 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270682 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7fm8\" (UniqueName: \"kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270702 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270719 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270737 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270791 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270810 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270834 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ztd\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270896 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270930 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.270959 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.271023 4990 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rwpjx\" (UniqueName: \"kubernetes.io/projected/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-kube-api-access-rwpjx\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.271039 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.271051 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ef01a2-2d06-4dd4-ae0e-53eac15d8500-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.271215 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.272302 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.273027 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.274196 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.277559 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.278194 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.279355 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.285155 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js" (OuterVolumeSpecName: "kube-api-access-k77js") pod "db03a21a-6575-489a-a9c8-cb6035363f12" (UID: "db03a21a-6575-489a-a9c8-cb6035363f12"). InnerVolumeSpecName "kube-api-access-k77js". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.295055 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fm8\" (UniqueName: \"kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8\") pod \"ovn-northd-0\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.307733 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.350672 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db03a21a-6575-489a-a9c8-cb6035363f12" (UID: "db03a21a-6575-489a-a9c8-cb6035363f12"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.372490 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.373749 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.373970 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.374527 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.373705 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" 
(UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.374403 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.373928 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.376215 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.376347 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:00:51.876322036 +0000 UTC m=+1033.672953893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.375002 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.375184 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ztd\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.380618 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k77js\" (UniqueName: \"kubernetes.io/projected/db03a21a-6575-489a-a9c8-cb6035363f12-kube-api-access-k77js\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.381169 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.381406 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config" (OuterVolumeSpecName: "config") pod "db03a21a-6575-489a-a9c8-cb6035363f12" (UID: "db03a21a-6575-489a-a9c8-cb6035363f12"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.403159 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ztd\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.413842 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.452650 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" event={"ID":"6630a347-e78f-4354-a4d2-2ba01ce1ab0c","Type":"ContainerStarted","Data":"e3e486c0b0f3632ca41794405d34dc7a483f79f5416821d9250871dab4f65109"} Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.454677 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" event={"ID":"86ef01a2-2d06-4dd4-ae0e-53eac15d8500","Type":"ContainerDied","Data":"ee476af2171e6aaf7a2b4234004143fb1b18438ce78b5efeeb255cd6b6a49505"} Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.454685 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b749bd587-w9mb4" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.454717 4990 scope.go:117] "RemoveContainer" containerID="87e847712ce89f6b0b8967bd83de7f499a15c4e274b6828a73d1f47baa1d42c5" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.459449 4990 generic.go:334] "Generic (PLEG): container finished" podID="768e70f6-3a6f-43ef-93a1-26750658f735" containerID="c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e" exitCode=0 Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.459703 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerDied","Data":"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e"} Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.461970 4990 generic.go:334] "Generic (PLEG): container finished" podID="db03a21a-6575-489a-a9c8-cb6035363f12" containerID="720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c" exitCode=0 Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.462349 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.464474 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" event={"ID":"db03a21a-6575-489a-a9c8-cb6035363f12","Type":"ContainerDied","Data":"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c"} Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.464541 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-sbw4c" event={"ID":"db03a21a-6575-489a-a9c8-cb6035363f12","Type":"ContainerDied","Data":"8c6613d18696259d3fd1829af0e66ab63a348732d43e546bd1e445bdc63a8dcc"} Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.487377 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db03a21a-6575-489a-a9c8-cb6035363f12-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.533633 4990 scope.go:117] "RemoveContainer" containerID="720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.548681 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.556609 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b749bd587-w9mb4"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.570674 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.635435 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-sbw4c"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.684913 4990 scope.go:117] "RemoveContainer" containerID="5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a" Oct 03 10:00:51 crc 
kubenswrapper[4990]: I1003 10:00:51.688937 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.698226 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-d5h6v"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.699456 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.706133 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.706327 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.706338 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.710930 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d5h6v"] Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.753017 4990 scope.go:117] "RemoveContainer" containerID="720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.759991 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c\": container with ID starting with 720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c not found: ID does not exist" containerID="720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.760072 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c"} err="failed to get container status \"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c\": rpc error: code = NotFound desc = could not find container \"720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c\": container with ID starting with 720d5679defe13df52c6aa6a2c520d33284c52aa27fe29ae9d29a5370c83e82c not found: ID does not exist" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.760115 4990 scope.go:117] "RemoveContainer" containerID="5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.760461 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a\": container with ID starting with 5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a not found: ID does not exist" containerID="5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.760495 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a"} err="failed to get container status \"5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a\": rpc error: code = NotFound desc = could not find container \"5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a\": container with ID starting with 5377d504df3e3fc409c81f1e3b1dde81c25c728f763e4b258a90615b6134da5a not found: ID does not exist" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802586 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle\") pod 
\"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802714 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802733 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802753 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802788 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r97fp\" (UniqueName: \"kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp\") pod 
\"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.802817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.843708 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905403 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905473 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905571 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905602 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.905638 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.905662 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: E1003 10:00:51.905711 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:00:52.905695599 +0000 UTC m=+1034.702327456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905745 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 
03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905762 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.905788 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r97fp\" (UniqueName: \"kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.906258 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.906308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.907064 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.912365 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.912820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.913767 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:51 crc kubenswrapper[4990]: I1003 10:00:51.926854 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r97fp\" (UniqueName: \"kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp\") pod \"swift-ring-rebalance-d5h6v\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.007397 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config\") pod \"dfd0ab25-3469-4914-a63a-e38533850c0f\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.008109 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb\") pod \"dfd0ab25-3469-4914-a63a-e38533850c0f\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.008404 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbcmn\" (UniqueName: \"kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn\") pod \"dfd0ab25-3469-4914-a63a-e38533850c0f\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.008906 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc\") pod \"dfd0ab25-3469-4914-a63a-e38533850c0f\" (UID: \"dfd0ab25-3469-4914-a63a-e38533850c0f\") " Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.011952 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn" (OuterVolumeSpecName: "kube-api-access-dbcmn") pod "dfd0ab25-3469-4914-a63a-e38533850c0f" (UID: "dfd0ab25-3469-4914-a63a-e38533850c0f"). InnerVolumeSpecName "kube-api-access-dbcmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.027400 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.070350 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfd0ab25-3469-4914-a63a-e38533850c0f" (UID: "dfd0ab25-3469-4914-a63a-e38533850c0f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.077087 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfd0ab25-3469-4914-a63a-e38533850c0f" (UID: "dfd0ab25-3469-4914-a63a-e38533850c0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.081879 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config" (OuterVolumeSpecName: "config") pod "dfd0ab25-3469-4914-a63a-e38533850c0f" (UID: "dfd0ab25-3469-4914-a63a-e38533850c0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.112060 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.112118 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.112132 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfd0ab25-3469-4914-a63a-e38533850c0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.112148 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbcmn\" (UniqueName: \"kubernetes.io/projected/dfd0ab25-3469-4914-a63a-e38533850c0f-kube-api-access-dbcmn\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:52 crc 
kubenswrapper[4990]: I1003 10:00:52.473698 4990 generic.go:334] "Generic (PLEG): container finished" podID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerID="6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe" exitCode=0 Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.473776 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" event={"ID":"6630a347-e78f-4354-a4d2-2ba01ce1ab0c","Type":"ContainerDied","Data":"6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe"} Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.475688 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" event={"ID":"dfd0ab25-3469-4914-a63a-e38533850c0f","Type":"ContainerDied","Data":"b3c7bef025844f505516a448f3edd02578111b994b29cb1b6a38cb947e6eeec3"} Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.475734 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9bfff7c-2dtbh" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.475749 4990 scope.go:117] "RemoveContainer" containerID="89b6ed3ecf90b015c7777f57abfa512b2589fe07b30b0c80f5f648cc07d56c95" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.479043 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerStarted","Data":"f27f94a578be160c2c3eb32cf78425c6c1a79bdc65cb119b96898f5a29032847"} Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.494316 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerStarted","Data":"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27"} Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.495319 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.548847 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d5h6v"] Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.639241 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" podStartSLOduration=3.639218279 podStartE2EDuration="3.639218279s" podCreationTimestamp="2025-10-03 10:00:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:00:52.559277759 +0000 UTC m=+1034.355909616" watchObservedRunningTime="2025-10-03 10:00:52.639218279 +0000 UTC m=+1034.435850136" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.650637 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.653744 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc9bfff7c-2dtbh"] Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.892736 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ef01a2-2d06-4dd4-ae0e-53eac15d8500" path="/var/lib/kubelet/pods/86ef01a2-2d06-4dd4-ae0e-53eac15d8500/volumes" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.894891 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db03a21a-6575-489a-a9c8-cb6035363f12" path="/var/lib/kubelet/pods/db03a21a-6575-489a-a9c8-cb6035363f12/volumes" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.895564 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd0ab25-3469-4914-a63a-e38533850c0f" path="/var/lib/kubelet/pods/dfd0ab25-3469-4914-a63a-e38533850c0f/volumes" Oct 03 10:00:52 crc kubenswrapper[4990]: I1003 10:00:52.929466 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:52 crc kubenswrapper[4990]: E1003 10:00:52.929667 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 10:00:52 crc kubenswrapper[4990]: E1003 10:00:52.929694 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:00:52 crc kubenswrapper[4990]: E1003 10:00:52.929753 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:00:54.929732974 +0000 UTC m=+1036.726364831 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:00:53 crc kubenswrapper[4990]: I1003 10:00:53.510527 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" event={"ID":"6630a347-e78f-4354-a4d2-2ba01ce1ab0c","Type":"ContainerStarted","Data":"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3"} Oct 03 10:00:53 crc kubenswrapper[4990]: I1003 10:00:53.510616 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:00:53 crc kubenswrapper[4990]: I1003 10:00:53.514491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d5h6v" 
event={"ID":"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7","Type":"ContainerStarted","Data":"40913e1cfac0f94925a5ba203254e163ad68f9ca3f6bfbea99cab245f0db8438"} Oct 03 10:00:53 crc kubenswrapper[4990]: I1003 10:00:53.540560 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" podStartSLOduration=3.540537835 podStartE2EDuration="3.540537835s" podCreationTimestamp="2025-10-03 10:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:00:53.530689053 +0000 UTC m=+1035.327320920" watchObservedRunningTime="2025-10-03 10:00:53.540537835 +0000 UTC m=+1035.337169702" Oct 03 10:00:54 crc kubenswrapper[4990]: I1003 10:00:54.966917 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:54 crc kubenswrapper[4990]: E1003 10:00:54.967799 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 10:00:54 crc kubenswrapper[4990]: E1003 10:00:54.967818 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:00:54 crc kubenswrapper[4990]: E1003 10:00:54.967862 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:00:58.96784372 +0000 UTC m=+1040.764475637 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:00:56 crc kubenswrapper[4990]: I1003 10:00:56.703577 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 10:00:56 crc kubenswrapper[4990]: I1003 10:00:56.703700 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 10:00:56 crc kubenswrapper[4990]: I1003 10:00:56.768181 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.604362 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.714315 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.714381 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.898712 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-88xpx"] Oct 03 10:00:57 crc kubenswrapper[4990]: E1003 10:00:57.899206 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd0ab25-3469-4914-a63a-e38533850c0f" containerName="init" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.899230 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd0ab25-3469-4914-a63a-e38533850c0f" containerName="init" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.899414 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd0ab25-3469-4914-a63a-e38533850c0f" 
containerName="init" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.900328 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-88xpx" Oct 03 10:00:57 crc kubenswrapper[4990]: I1003 10:00:57.909567 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-88xpx"] Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.035821 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5js\" (UniqueName: \"kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js\") pod \"keystone-db-create-88xpx\" (UID: \"9eda8f25-c4cb-4c1d-86cf-d09703e3c953\") " pod="openstack/keystone-db-create-88xpx" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.066584 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n8kfh"] Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.068064 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n8kfh" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.079161 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n8kfh"] Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.137836 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c92tc\" (UniqueName: \"kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc\") pod \"placement-db-create-n8kfh\" (UID: \"544a2832-1a0d-4251-8087-8321f2f24908\") " pod="openstack/placement-db-create-n8kfh" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.138208 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5js\" (UniqueName: \"kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js\") pod \"keystone-db-create-88xpx\" (UID: \"9eda8f25-c4cb-4c1d-86cf-d09703e3c953\") " pod="openstack/keystone-db-create-88xpx" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.159152 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5js\" (UniqueName: \"kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js\") pod \"keystone-db-create-88xpx\" (UID: \"9eda8f25-c4cb-4c1d-86cf-d09703e3c953\") " pod="openstack/keystone-db-create-88xpx" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.229510 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-88xpx" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.239563 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c92tc\" (UniqueName: \"kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc\") pod \"placement-db-create-n8kfh\" (UID: \"544a2832-1a0d-4251-8087-8321f2f24908\") " pod="openstack/placement-db-create-n8kfh" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.256316 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c92tc\" (UniqueName: \"kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc\") pod \"placement-db-create-n8kfh\" (UID: \"544a2832-1a0d-4251-8087-8321f2f24908\") " pod="openstack/placement-db-create-n8kfh" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.334025 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rxq97"] Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.335008 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rxq97" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.345354 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rxq97"] Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.395245 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n8kfh" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.443556 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65mb\" (UniqueName: \"kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb\") pod \"glance-db-create-rxq97\" (UID: \"7a464354-7171-4c4d-8d90-cdd1e8d35803\") " pod="openstack/glance-db-create-rxq97" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.545468 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65mb\" (UniqueName: \"kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb\") pod \"glance-db-create-rxq97\" (UID: \"7a464354-7171-4c4d-8d90-cdd1e8d35803\") " pod="openstack/glance-db-create-rxq97" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.564617 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65mb\" (UniqueName: \"kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb\") pod \"glance-db-create-rxq97\" (UID: \"7a464354-7171-4c4d-8d90-cdd1e8d35803\") " pod="openstack/glance-db-create-rxq97" Oct 03 10:00:58 crc kubenswrapper[4990]: I1003 10:00:58.651343 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rxq97" Oct 03 10:00:59 crc kubenswrapper[4990]: I1003 10:00:59.055359 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:00:59 crc kubenswrapper[4990]: E1003 10:00:59.055577 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 10:00:59 crc kubenswrapper[4990]: E1003 10:00:59.055607 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:00:59 crc kubenswrapper[4990]: E1003 10:00:59.055663 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:01:07.055646376 +0000 UTC m=+1048.852278233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:00:59 crc kubenswrapper[4990]: I1003 10:00:59.576203 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.314078 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.361297 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-88xpx"] Oct 03 10:01:00 crc kubenswrapper[4990]: W1003 10:01:00.364806 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a464354_7171_4c4d_8d90_cdd1e8d35803.slice/crio-f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006 WatchSource:0}: Error finding container f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006: Status 404 returned error can't find the container with id f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006 Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.369300 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rxq97"] Oct 03 10:01:00 crc kubenswrapper[4990]: W1003 10:01:00.380553 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eda8f25_c4cb_4c1d_86cf_d09703e3c953.slice/crio-ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255 WatchSource:0}: Error finding container ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255: Status 404 returned error can't find the container with id 
ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255 Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.382956 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.476773 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n8kfh"] Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.490862 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.569432 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.579078 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8kfh" event={"ID":"544a2832-1a0d-4251-8087-8321f2f24908","Type":"ContainerStarted","Data":"598f245d94875e1ae0eac15a6395543650eee4eac72db379481e79d41f1ebf23"} Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.583201 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerStarted","Data":"6b3ab8b29ea9f1b7b6e8f40edcf60f82f96ba6bf30e0f0a4afbe62bd168e8f7a"} Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.583278 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerStarted","Data":"7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2"} Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.583340 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.596893 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxq97" 
event={"ID":"7a464354-7171-4c4d-8d90-cdd1e8d35803","Type":"ContainerStarted","Data":"f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006"} Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.602500 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-88xpx" event={"ID":"9eda8f25-c4cb-4c1d-86cf-d09703e3c953","Type":"ContainerStarted","Data":"ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255"} Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.602745 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="dnsmasq-dns" containerID="cri-o://2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27" gracePeriod=10 Oct 03 10:01:00 crc kubenswrapper[4990]: I1003 10:01:00.619015 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.34119757 podStartE2EDuration="10.618997789s" podCreationTimestamp="2025-10-03 10:00:50 +0000 UTC" firstStartedPulling="2025-10-03 10:00:51.752957673 +0000 UTC m=+1033.549589530" lastFinishedPulling="2025-10-03 10:01:00.030757892 +0000 UTC m=+1041.827389749" observedRunningTime="2025-10-03 10:01:00.616895087 +0000 UTC m=+1042.413526964" watchObservedRunningTime="2025-10-03 10:01:00.618997789 +0000 UTC m=+1042.415629646" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.230027 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.329596 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb\") pod \"768e70f6-3a6f-43ef-93a1-26750658f735\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.329707 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xkwp\" (UniqueName: \"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp\") pod \"768e70f6-3a6f-43ef-93a1-26750658f735\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.329768 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config\") pod \"768e70f6-3a6f-43ef-93a1-26750658f735\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.329873 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb\") pod \"768e70f6-3a6f-43ef-93a1-26750658f735\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.329904 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc\") pod \"768e70f6-3a6f-43ef-93a1-26750658f735\" (UID: \"768e70f6-3a6f-43ef-93a1-26750658f735\") " Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.336645 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp" (OuterVolumeSpecName: "kube-api-access-4xkwp") pod "768e70f6-3a6f-43ef-93a1-26750658f735" (UID: "768e70f6-3a6f-43ef-93a1-26750658f735"). InnerVolumeSpecName "kube-api-access-4xkwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.377809 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config" (OuterVolumeSpecName: "config") pod "768e70f6-3a6f-43ef-93a1-26750658f735" (UID: "768e70f6-3a6f-43ef-93a1-26750658f735"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.379547 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "768e70f6-3a6f-43ef-93a1-26750658f735" (UID: "768e70f6-3a6f-43ef-93a1-26750658f735"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.393677 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "768e70f6-3a6f-43ef-93a1-26750658f735" (UID: "768e70f6-3a6f-43ef-93a1-26750658f735"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.409070 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "768e70f6-3a6f-43ef-93a1-26750658f735" (UID: "768e70f6-3a6f-43ef-93a1-26750658f735"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.434658 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.434724 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xkwp\" (UniqueName: \"kubernetes.io/projected/768e70f6-3a6f-43ef-93a1-26750658f735-kube-api-access-4xkwp\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.434741 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.434763 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.434776 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/768e70f6-3a6f-43ef-93a1-26750658f735-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.615546 4990 generic.go:334] "Generic (PLEG): container finished" podID="544a2832-1a0d-4251-8087-8321f2f24908" containerID="18687cd53680abd930c5680f7523fd3629930d839ab92a472f3af82c58a7c9b4" exitCode=0 Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.616156 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8kfh" event={"ID":"544a2832-1a0d-4251-8087-8321f2f24908","Type":"ContainerDied","Data":"18687cd53680abd930c5680f7523fd3629930d839ab92a472f3af82c58a7c9b4"} Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 
10:01:01.620166 4990 generic.go:334] "Generic (PLEG): container finished" podID="7a464354-7171-4c4d-8d90-cdd1e8d35803" containerID="0e8e95a6d709c8f44add9d81ff967c2d33db6287fdfd87854e697b2031a80bfe" exitCode=0 Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.620241 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxq97" event={"ID":"7a464354-7171-4c4d-8d90-cdd1e8d35803","Type":"ContainerDied","Data":"0e8e95a6d709c8f44add9d81ff967c2d33db6287fdfd87854e697b2031a80bfe"} Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.622205 4990 generic.go:334] "Generic (PLEG): container finished" podID="9eda8f25-c4cb-4c1d-86cf-d09703e3c953" containerID="0f36f93a58a4abcefc2399a68eec1ce5537116c70acf225f4a2cad1ad503ac9d" exitCode=0 Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.622262 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-88xpx" event={"ID":"9eda8f25-c4cb-4c1d-86cf-d09703e3c953","Type":"ContainerDied","Data":"0f36f93a58a4abcefc2399a68eec1ce5537116c70acf225f4a2cad1ad503ac9d"} Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.624264 4990 generic.go:334] "Generic (PLEG): container finished" podID="768e70f6-3a6f-43ef-93a1-26750658f735" containerID="2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27" exitCode=0 Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.624305 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.624337 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerDied","Data":"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27"} Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.624363 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-lr2jj" event={"ID":"768e70f6-3a6f-43ef-93a1-26750658f735","Type":"ContainerDied","Data":"9354619f37e7ad4d5752c1ac636656c4d5a3bab86b321519176a1dba70c85342"} Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.624387 4990 scope.go:117] "RemoveContainer" containerID="2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27" Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.682292 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:01:01 crc kubenswrapper[4990]: I1003 10:01:01.690277 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-lr2jj"] Oct 03 10:01:02 crc kubenswrapper[4990]: I1003 10:01:02.886367 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" path="/var/lib/kubelet/pods/768e70f6-3a6f-43ef-93a1-26750658f735/volumes" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.445863 4990 scope.go:117] "RemoveContainer" containerID="c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.615610 4990 scope.go:117] "RemoveContainer" containerID="2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27" Oct 03 10:01:03 crc kubenswrapper[4990]: E1003 10:01:03.616326 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27\": container with ID starting with 2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27 not found: ID does not exist" containerID="2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.616360 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27"} err="failed to get container status \"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27\": rpc error: code = NotFound desc = could not find container \"2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27\": container with ID starting with 2194c6e96424266a6eae6717a5a1d9c7841d75700bbda390609a3b73c9c72f27 not found: ID does not exist" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.616385 4990 scope.go:117] "RemoveContainer" containerID="c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e" Oct 03 10:01:03 crc kubenswrapper[4990]: E1003 10:01:03.616842 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e\": container with ID starting with c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e not found: ID does not exist" containerID="c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.616885 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e"} err="failed to get container status \"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e\": rpc error: code = NotFound desc = could not find container \"c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e\": container with ID 
starting with c343f362498d998a02ef7e64a04c5cb2a1bec3100eed50f5e05fdbb2d468bc0e not found: ID does not exist" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.631971 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rxq97" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.638283 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8kfh" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.641499 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n8kfh" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.641539 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n8kfh" event={"ID":"544a2832-1a0d-4251-8087-8321f2f24908","Type":"ContainerDied","Data":"598f245d94875e1ae0eac15a6395543650eee4eac72db379481e79d41f1ebf23"} Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.641596 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="598f245d94875e1ae0eac15a6395543650eee4eac72db379481e79d41f1ebf23" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.643415 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxq97" event={"ID":"7a464354-7171-4c4d-8d90-cdd1e8d35803","Type":"ContainerDied","Data":"f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006"} Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.643468 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8749b96a205e6bbdfbfcd1377747127e189529a6d7a53c6eaacd5924ab50006" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.643540 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rxq97" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.648264 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-88xpx" event={"ID":"9eda8f25-c4cb-4c1d-86cf-d09703e3c953","Type":"ContainerDied","Data":"ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255"} Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.648303 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec1e3b4fa7601749643c1628a98b2a7d0eb588a02d3e89000614597add029255" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.667834 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-88xpx" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.682372 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c92tc\" (UniqueName: \"kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc\") pod \"544a2832-1a0d-4251-8087-8321f2f24908\" (UID: \"544a2832-1a0d-4251-8087-8321f2f24908\") " Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.682427 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65mb\" (UniqueName: \"kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb\") pod \"7a464354-7171-4c4d-8d90-cdd1e8d35803\" (UID: \"7a464354-7171-4c4d-8d90-cdd1e8d35803\") " Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.689302 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb" (OuterVolumeSpecName: "kube-api-access-m65mb") pod "7a464354-7171-4c4d-8d90-cdd1e8d35803" (UID: "7a464354-7171-4c4d-8d90-cdd1e8d35803"). InnerVolumeSpecName "kube-api-access-m65mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.689895 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc" (OuterVolumeSpecName: "kube-api-access-c92tc") pod "544a2832-1a0d-4251-8087-8321f2f24908" (UID: "544a2832-1a0d-4251-8087-8321f2f24908"). InnerVolumeSpecName "kube-api-access-c92tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.783823 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw5js\" (UniqueName: \"kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js\") pod \"9eda8f25-c4cb-4c1d-86cf-d09703e3c953\" (UID: \"9eda8f25-c4cb-4c1d-86cf-d09703e3c953\") " Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.784702 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c92tc\" (UniqueName: \"kubernetes.io/projected/544a2832-1a0d-4251-8087-8321f2f24908-kube-api-access-c92tc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.784728 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65mb\" (UniqueName: \"kubernetes.io/projected/7a464354-7171-4c4d-8d90-cdd1e8d35803-kube-api-access-m65mb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.787262 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js" (OuterVolumeSpecName: "kube-api-access-kw5js") pod "9eda8f25-c4cb-4c1d-86cf-d09703e3c953" (UID: "9eda8f25-c4cb-4c1d-86cf-d09703e3c953"). InnerVolumeSpecName "kube-api-access-kw5js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:03 crc kubenswrapper[4990]: I1003 10:01:03.886296 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw5js\" (UniqueName: \"kubernetes.io/projected/9eda8f25-c4cb-4c1d-86cf-d09703e3c953-kube-api-access-kw5js\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:04 crc kubenswrapper[4990]: I1003 10:01:04.664973 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d5h6v" event={"ID":"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7","Type":"ContainerStarted","Data":"d5ee1abf7a163799410b1511c4a9e8f15feba5f3acad686e5c5172038d090f40"} Oct 03 10:01:04 crc kubenswrapper[4990]: I1003 10:01:04.665077 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-88xpx" Oct 03 10:01:04 crc kubenswrapper[4990]: I1003 10:01:04.693564 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-d5h6v" podStartSLOduration=2.778057689 podStartE2EDuration="13.693512958s" podCreationTimestamp="2025-10-03 10:00:51 +0000 UTC" firstStartedPulling="2025-10-03 10:00:52.582273563 +0000 UTC m=+1034.378905420" lastFinishedPulling="2025-10-03 10:01:03.497728832 +0000 UTC m=+1045.294360689" observedRunningTime="2025-10-03 10:01:04.680836387 +0000 UTC m=+1046.477468264" watchObservedRunningTime="2025-10-03 10:01:04.693512958 +0000 UTC m=+1046.490144835" Oct 03 10:01:07 crc kubenswrapper[4990]: I1003 10:01:07.144418 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:01:07 crc kubenswrapper[4990]: E1003 10:01:07.144635 4990 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 
10:01:07 crc kubenswrapper[4990]: E1003 10:01:07.144890 4990 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 10:01:07 crc kubenswrapper[4990]: E1003 10:01:07.144950 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift podName:cca92a2a-2e3d-4e52-8ed8-a4dc709915b6 nodeName:}" failed. No retries permitted until 2025-10-03 10:01:23.144926832 +0000 UTC m=+1064.941558689 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift") pod "swift-storage-0" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6") : configmap "swift-ring-files" not found Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.452679 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5d27-account-create-qmpzq"] Oct 03 10:01:08 crc kubenswrapper[4990]: E1003 10:01:08.453270 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eda8f25-c4cb-4c1d-86cf-d09703e3c953" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453282 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eda8f25-c4cb-4c1d-86cf-d09703e3c953" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: E1003 10:01:08.453290 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="init" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453295 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="init" Oct 03 10:01:08 crc kubenswrapper[4990]: E1003 10:01:08.453306 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="dnsmasq-dns" Oct 03 
10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453312 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="dnsmasq-dns" Oct 03 10:01:08 crc kubenswrapper[4990]: E1003 10:01:08.453327 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a464354-7171-4c4d-8d90-cdd1e8d35803" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453332 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a464354-7171-4c4d-8d90-cdd1e8d35803" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: E1003 10:01:08.453350 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544a2832-1a0d-4251-8087-8321f2f24908" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453356 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="544a2832-1a0d-4251-8087-8321f2f24908" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453535 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="768e70f6-3a6f-43ef-93a1-26750658f735" containerName="dnsmasq-dns" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453550 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="544a2832-1a0d-4251-8087-8321f2f24908" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453573 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a464354-7171-4c4d-8d90-cdd1e8d35803" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.453582 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eda8f25-c4cb-4c1d-86cf-d09703e3c953" containerName="mariadb-database-create" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.454193 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.456606 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.468800 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d27-account-create-qmpzq"] Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.569849 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzrp\" (UniqueName: \"kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp\") pod \"glance-5d27-account-create-qmpzq\" (UID: \"372689e3-d306-4bc7-86eb-b44920f77a78\") " pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.671203 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzrp\" (UniqueName: \"kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp\") pod \"glance-5d27-account-create-qmpzq\" (UID: \"372689e3-d306-4bc7-86eb-b44920f77a78\") " pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.699563 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzrp\" (UniqueName: \"kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp\") pod \"glance-5d27-account-create-qmpzq\" (UID: \"372689e3-d306-4bc7-86eb-b44920f77a78\") " pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.700907 4990 generic.go:334] "Generic (PLEG): container finished" podID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerID="46a2006f531297eb507c3d080522e05f935cfe53f8d27382af0ef0806a9315a1" exitCode=0 Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.700988 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerDied","Data":"46a2006f531297eb507c3d080522e05f935cfe53f8d27382af0ef0806a9315a1"} Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.703927 4990 generic.go:334] "Generic (PLEG): container finished" podID="51461d28-e850-4ba3-8f27-0252b51903f1" containerID="b4d7bb564bea84a147b666614fdc109dbffb801e15bd785bda000207cddae019" exitCode=0 Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.703972 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerDied","Data":"b4d7bb564bea84a147b666614fdc109dbffb801e15bd785bda000207cddae019"} Oct 03 10:01:08 crc kubenswrapper[4990]: I1003 10:01:08.770870 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.200078 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d27-account-create-qmpzq"] Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.713224 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerStarted","Data":"39ca9311a2a836a4c7c4e3966e9e9682edd65a8a88709915dec6b0db579a3d63"} Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.714545 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.717215 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerStarted","Data":"6e738f47b293c30c60c0c8652362ba0397b75b7dc42d631b6865d77252a55f68"} Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.717479 4990 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.719322 4990 generic.go:334] "Generic (PLEG): container finished" podID="372689e3-d306-4bc7-86eb-b44920f77a78" containerID="21d85a37e8f6cc7bb88e6ce18cde1192ecd28961bc25f414c7f82da286e51f23" exitCode=0 Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.719363 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d27-account-create-qmpzq" event={"ID":"372689e3-d306-4bc7-86eb-b44920f77a78","Type":"ContainerDied","Data":"21d85a37e8f6cc7bb88e6ce18cde1192ecd28961bc25f414c7f82da286e51f23"} Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.719388 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d27-account-create-qmpzq" event={"ID":"372689e3-d306-4bc7-86eb-b44920f77a78","Type":"ContainerStarted","Data":"47a4825519546dda219fed1a55b290d99407b1a4230bf29381aab3d9fd10ee86"} Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.751623 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.024992201 podStartE2EDuration="56.751602822s" podCreationTimestamp="2025-10-03 10:00:13 +0000 UTC" firstStartedPulling="2025-10-03 10:00:19.871231642 +0000 UTC m=+1001.667863499" lastFinishedPulling="2025-10-03 10:00:34.597842263 +0000 UTC m=+1016.394474120" observedRunningTime="2025-10-03 10:01:09.748152097 +0000 UTC m=+1051.544783964" watchObservedRunningTime="2025-10-03 10:01:09.751602822 +0000 UTC m=+1051.548234679" Oct 03 10:01:09 crc kubenswrapper[4990]: I1003 10:01:09.791470 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.594713765 podStartE2EDuration="56.791451179s" podCreationTimestamp="2025-10-03 10:00:13 +0000 UTC" firstStartedPulling="2025-10-03 10:00:15.440028413 +0000 UTC m=+997.236660270" lastFinishedPulling="2025-10-03 
10:00:34.636765837 +0000 UTC m=+1016.433397684" observedRunningTime="2025-10-03 10:01:09.78577536 +0000 UTC m=+1051.582407227" watchObservedRunningTime="2025-10-03 10:01:09.791451179 +0000 UTC m=+1051.588083036" Oct 03 10:01:10 crc kubenswrapper[4990]: I1003 10:01:10.728658 4990 generic.go:334] "Generic (PLEG): container finished" podID="ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" containerID="d5ee1abf7a163799410b1511c4a9e8f15feba5f3acad686e5c5172038d090f40" exitCode=0 Oct 03 10:01:10 crc kubenswrapper[4990]: I1003 10:01:10.728786 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d5h6v" event={"ID":"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7","Type":"ContainerDied","Data":"d5ee1abf7a163799410b1511c4a9e8f15feba5f3acad686e5c5172038d090f40"} Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.074148 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.220242 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwzrp\" (UniqueName: \"kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp\") pod \"372689e3-d306-4bc7-86eb-b44920f77a78\" (UID: \"372689e3-d306-4bc7-86eb-b44920f77a78\") " Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.238586 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp" (OuterVolumeSpecName: "kube-api-access-nwzrp") pod "372689e3-d306-4bc7-86eb-b44920f77a78" (UID: "372689e3-d306-4bc7-86eb-b44920f77a78"). InnerVolumeSpecName "kube-api-access-nwzrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.322954 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwzrp\" (UniqueName: \"kubernetes.io/projected/372689e3-d306-4bc7-86eb-b44920f77a78-kube-api-access-nwzrp\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.395865 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.737510 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d27-account-create-qmpzq" event={"ID":"372689e3-d306-4bc7-86eb-b44920f77a78","Type":"ContainerDied","Data":"47a4825519546dda219fed1a55b290d99407b1a4230bf29381aab3d9fd10ee86"} Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.737607 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a4825519546dda219fed1a55b290d99407b1a4230bf29381aab3d9fd10ee86" Oct 03 10:01:11 crc kubenswrapper[4990]: I1003 10:01:11.737567 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d27-account-create-qmpzq" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.049038 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135083 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r97fp\" (UniqueName: \"kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135229 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135329 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135352 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135378 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts\") pod \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\" (UID: \"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7\") " Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.135773 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.136041 4990 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.136100 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.140732 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp" (OuterVolumeSpecName: "kube-api-access-r97fp") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "kube-api-access-r97fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.143528 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.163286 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts" (OuterVolumeSpecName: "scripts") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.163808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.176151 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" (UID: "ca95bca6-8a90-4f5e-a615-ac88ab3b1be7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237274 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r97fp\" (UniqueName: \"kubernetes.io/projected/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-kube-api-access-r97fp\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237318 4990 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237332 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237344 4990 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237356 4990 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.237370 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.746189 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d5h6v" event={"ID":"ca95bca6-8a90-4f5e-a615-ac88ab3b1be7","Type":"ContainerDied","Data":"40913e1cfac0f94925a5ba203254e163ad68f9ca3f6bfbea99cab245f0db8438"} Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.746225 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d5h6v" Oct 03 10:01:12 crc kubenswrapper[4990]: I1003 10:01:12.746237 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40913e1cfac0f94925a5ba203254e163ad68f9ca3f6bfbea99cab245f0db8438" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.598526 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4h7gg"] Oct 03 10:01:13 crc kubenswrapper[4990]: E1003 10:01:13.599230 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372689e3-d306-4bc7-86eb-b44920f77a78" containerName="mariadb-account-create" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.599251 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="372689e3-d306-4bc7-86eb-b44920f77a78" containerName="mariadb-account-create" Oct 03 10:01:13 crc kubenswrapper[4990]: E1003 10:01:13.599263 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" containerName="swift-ring-rebalance" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.599271 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" containerName="swift-ring-rebalance" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.599491 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="372689e3-d306-4bc7-86eb-b44920f77a78" 
containerName="mariadb-account-create" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.599535 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" containerName="swift-ring-rebalance" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.600181 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.603237 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.605851 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bw7f6" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.611893 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4h7gg"] Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.759436 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4grm2\" (UniqueName: \"kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.759506 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.759571 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data\") pod 
\"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.759714 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.861687 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4grm2\" (UniqueName: \"kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.861757 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.861796 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.861868 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " 
pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.869422 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.869434 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.877347 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.877538 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4grm2\" (UniqueName: \"kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2\") pod \"glance-db-sync-4h7gg\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:13 crc kubenswrapper[4990]: I1003 10:01:13.918725 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:14 crc kubenswrapper[4990]: I1003 10:01:14.453819 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4h7gg"] Oct 03 10:01:14 crc kubenswrapper[4990]: I1003 10:01:14.760564 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4h7gg" event={"ID":"b48916be-afdb-47ca-8eed-d7ad817883b3","Type":"ContainerStarted","Data":"334902d3b927a6863c28d45693d5f9c82d8d0e90a50fb639a9e1597b9287ec44"} Oct 03 10:01:14 crc kubenswrapper[4990]: I1003 10:01:14.793083 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:01:14 crc kubenswrapper[4990]: I1003 10:01:14.794175 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output=< Oct 03 10:01:14 crc kubenswrapper[4990]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 10:01:14 crc kubenswrapper[4990]: > Oct 03 10:01:17 crc kubenswrapper[4990]: I1003 10:01:17.937854 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1c53-account-create-j26hk"] Oct 03 10:01:17 crc kubenswrapper[4990]: I1003 10:01:17.939476 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:17 crc kubenswrapper[4990]: I1003 10:01:17.941693 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 10:01:17 crc kubenswrapper[4990]: I1003 10:01:17.945852 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1c53-account-create-j26hk"] Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.030749 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzqgn\" (UniqueName: \"kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn\") pod \"keystone-1c53-account-create-j26hk\" (UID: \"e3921c9c-8444-4c5f-814b-f6a26a9cf5df\") " pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.132996 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqgn\" (UniqueName: \"kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn\") pod \"keystone-1c53-account-create-j26hk\" (UID: \"e3921c9c-8444-4c5f-814b-f6a26a9cf5df\") " pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.140927 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-20bd-account-create-dxfch"] Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.142629 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.145049 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.149554 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-20bd-account-create-dxfch"] Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.159029 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqgn\" (UniqueName: \"kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn\") pod \"keystone-1c53-account-create-j26hk\" (UID: \"e3921c9c-8444-4c5f-814b-f6a26a9cf5df\") " pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.234119 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5x5z\" (UniqueName: \"kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z\") pod \"placement-20bd-account-create-dxfch\" (UID: \"feb8b5ec-6556-4999-a5a9-4f1e22dc4140\") " pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.263646 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.335361 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5x5z\" (UniqueName: \"kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z\") pod \"placement-20bd-account-create-dxfch\" (UID: \"feb8b5ec-6556-4999-a5a9-4f1e22dc4140\") " pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.354773 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5x5z\" (UniqueName: \"kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z\") pod \"placement-20bd-account-create-dxfch\" (UID: \"feb8b5ec-6556-4999-a5a9-4f1e22dc4140\") " pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.458925 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.690356 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1c53-account-create-j26hk"] Oct 03 10:01:18 crc kubenswrapper[4990]: W1003 10:01:18.694247 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3921c9c_8444_4c5f_814b_f6a26a9cf5df.slice/crio-3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d WatchSource:0}: Error finding container 3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d: Status 404 returned error can't find the container with id 3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.794304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1c53-account-create-j26hk" event={"ID":"e3921c9c-8444-4c5f-814b-f6a26a9cf5df","Type":"ContainerStarted","Data":"3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d"} Oct 03 10:01:18 crc kubenswrapper[4990]: I1003 10:01:18.888595 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-20bd-account-create-dxfch"] Oct 03 10:01:19 crc kubenswrapper[4990]: I1003 10:01:19.796648 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:01:19 crc kubenswrapper[4990]: I1003 10:01:19.799985 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output=< Oct 03 10:01:19 crc kubenswrapper[4990]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 10:01:19 crc kubenswrapper[4990]: > Oct 03 10:01:19 crc kubenswrapper[4990]: I1003 10:01:19.806432 4990 generic.go:334] "Generic (PLEG): 
container finished" podID="e3921c9c-8444-4c5f-814b-f6a26a9cf5df" containerID="5218c4ea0a9ac0c5d479f39fef59faf1bab332d65ddb4788247bc0adbb6de898" exitCode=0 Oct 03 10:01:19 crc kubenswrapper[4990]: I1003 10:01:19.806475 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1c53-account-create-j26hk" event={"ID":"e3921c9c-8444-4c5f-814b-f6a26a9cf5df","Type":"ContainerDied","Data":"5218c4ea0a9ac0c5d479f39fef59faf1bab332d65ddb4788247bc0adbb6de898"} Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.009053 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfzkg-config-vt6h8"] Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.010196 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.012403 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.019257 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg-config-vt6h8"] Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.188952 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.189030 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg2wn\" (UniqueName: \"kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " 
pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.189300 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.189370 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.189392 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.189415 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.290876 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" 
(UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.290930 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.290948 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.290967 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.291003 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.291043 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg2wn\" (UniqueName: \"kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: 
\"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.291686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.291761 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.291849 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.293943 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.294416 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " 
pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.314284 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg2wn\" (UniqueName: \"kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn\") pod \"ovn-controller-nfzkg-config-vt6h8\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:20 crc kubenswrapper[4990]: I1003 10:01:20.349919 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:23 crc kubenswrapper[4990]: I1003 10:01:23.243562 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:01:23 crc kubenswrapper[4990]: I1003 10:01:23.251556 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"swift-storage-0\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " pod="openstack/swift-storage-0" Oct 03 10:01:23 crc kubenswrapper[4990]: I1003 10:01:23.533496 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 10:01:24 crc kubenswrapper[4990]: I1003 10:01:24.787955 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output=< Oct 03 10:01:24 crc kubenswrapper[4990]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 10:01:24 crc kubenswrapper[4990]: > Oct 03 10:01:24 crc kubenswrapper[4990]: I1003 10:01:24.913734 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.215827 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.240273 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tl56s"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.241557 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.252667 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tl56s"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.303601 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.303887 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:01:25 crc kubenswrapper[4990]: W1003 10:01:25.329451 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb8b5ec_6556_4999_a5a9_4f1e22dc4140.slice/crio-30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599 WatchSource:0}: Error finding container 30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599: Status 404 returned error can't find the container with id 30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599 Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.378149 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5gv6z"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.379420 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.380013 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdctz\" (UniqueName: \"kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz\") pod \"cinder-db-create-tl56s\" (UID: \"1e549854-6717-4898-a5ee-aca6972206a7\") " pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.389806 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5gv6z"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.481352 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdr9\" (UniqueName: \"kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9\") pod \"barbican-db-create-5gv6z\" (UID: \"88e71897-19ff-43cc-86cb-e2c5c2bce780\") " pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.481406 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdctz\" (UniqueName: \"kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz\") pod \"cinder-db-create-tl56s\" (UID: \"1e549854-6717-4898-a5ee-aca6972206a7\") " pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.487727 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fblmx"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.489030 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.489852 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.499487 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fblmx"] Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.517423 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdctz\" (UniqueName: \"kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz\") pod \"cinder-db-create-tl56s\" (UID: \"1e549854-6717-4898-a5ee-aca6972206a7\") " pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.569784 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.583830 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzqgn\" (UniqueName: \"kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn\") pod \"e3921c9c-8444-4c5f-814b-f6a26a9cf5df\" (UID: \"e3921c9c-8444-4c5f-814b-f6a26a9cf5df\") " Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.584178 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzx6\" (UniqueName: \"kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6\") pod \"neutron-db-create-fblmx\" (UID: \"dc1b0e32-43df-49be-944a-ad51d76dbf32\") " pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.584209 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdr9\" (UniqueName: \"kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9\") pod \"barbican-db-create-5gv6z\" (UID: \"88e71897-19ff-43cc-86cb-e2c5c2bce780\") " pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:25 crc 
kubenswrapper[4990]: I1003 10:01:25.600829 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdr9\" (UniqueName: \"kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9\") pod \"barbican-db-create-5gv6z\" (UID: \"88e71897-19ff-43cc-86cb-e2c5c2bce780\") " pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.601313 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn" (OuterVolumeSpecName: "kube-api-access-zzqgn") pod "e3921c9c-8444-4c5f-814b-f6a26a9cf5df" (UID: "e3921c9c-8444-4c5f-814b-f6a26a9cf5df"). InnerVolumeSpecName "kube-api-access-zzqgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.640762 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.686022 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzx6\" (UniqueName: \"kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6\") pod \"neutron-db-create-fblmx\" (UID: \"dc1b0e32-43df-49be-944a-ad51d76dbf32\") " pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.686229 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzqgn\" (UniqueName: \"kubernetes.io/projected/e3921c9c-8444-4c5f-814b-f6a26a9cf5df-kube-api-access-zzqgn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.712161 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzx6\" (UniqueName: \"kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6\") pod \"neutron-db-create-fblmx\" (UID: 
\"dc1b0e32-43df-49be-944a-ad51d76dbf32\") " pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.874184 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1c53-account-create-j26hk" event={"ID":"e3921c9c-8444-4c5f-814b-f6a26a9cf5df","Type":"ContainerDied","Data":"3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d"} Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.874216 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e4ebd596d9c297df69515008ffa9f32076a2028a978334301e2bcdb188d571d" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.874252 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1c53-account-create-j26hk" Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.877177 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20bd-account-create-dxfch" event={"ID":"feb8b5ec-6556-4999-a5a9-4f1e22dc4140","Type":"ContainerStarted","Data":"30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599"} Oct 03 10:01:25 crc kubenswrapper[4990]: I1003 10:01:25.967226 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:26 crc kubenswrapper[4990]: I1003 10:01:26.017626 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg-config-vt6h8"] Oct 03 10:01:26 crc kubenswrapper[4990]: I1003 10:01:26.059206 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 10:01:26 crc kubenswrapper[4990]: I1003 10:01:26.093452 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tl56s"] Oct 03 10:01:26 crc kubenswrapper[4990]: I1003 10:01:26.218769 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5gv6z"] Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.551329 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vc47v"] Oct 03 10:01:28 crc kubenswrapper[4990]: E1003 10:01:28.552050 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3921c9c-8444-4c5f-814b-f6a26a9cf5df" containerName="mariadb-account-create" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.552064 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3921c9c-8444-4c5f-814b-f6a26a9cf5df" containerName="mariadb-account-create" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.552246 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3921c9c-8444-4c5f-814b-f6a26a9cf5df" containerName="mariadb-account-create" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.552853 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.554765 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk8mw" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.555043 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.555403 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.559747 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.575426 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vc47v"] Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.640187 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjpg\" (UniqueName: \"kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.640342 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.640405 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle\") pod \"keystone-db-sync-vc47v\" (UID: 
\"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.742300 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.742378 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.742441 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjpg\" (UniqueName: \"kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.753428 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.754113 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: 
I1003 10:01:28.758838 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjpg\" (UniqueName: \"kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg\") pod \"keystone-db-sync-vc47v\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:28 crc kubenswrapper[4990]: I1003 10:01:28.869939 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.153586 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcca92a2a_2e3d_4e52_8ed8_a4dc709915b6.slice/crio-127cc36df918eeae38ea1d8c57b81fbff89e450fc2bd33ee21659f582a24901b WatchSource:0}: Error finding container 127cc36df918eeae38ea1d8c57b81fbff89e450fc2bd33ee21659f582a24901b: Status 404 returned error can't find the container with id 127cc36df918eeae38ea1d8c57b81fbff89e450fc2bd33ee21659f582a24901b Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.160266 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda35976c_7da5_428a_bb8d_10c55388118f.slice/crio-301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894 WatchSource:0}: Error finding container 301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894: Status 404 returned error can't find the container with id 301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894 Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.163479 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e549854_6717_4898_a5ee_aca6972206a7.slice/crio-4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604 WatchSource:0}: Error finding container 
4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604: Status 404 returned error can't find the container with id 4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604 Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.167148 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88e71897_19ff_43cc_86cb_e2c5c2bce780.slice/crio-113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a WatchSource:0}: Error finding container 113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a: Status 404 returned error can't find the container with id 113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.660061 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vc47v"] Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.661989 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedcff799_1dbf_4115_9f05_3e5164b331ad.slice/crio-0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd WatchSource:0}: Error finding container 0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd: Status 404 returned error can't find the container with id 0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.734374 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fblmx"] Oct 03 10:01:29 crc kubenswrapper[4990]: W1003 10:01:29.739084 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc1b0e32_43df_49be_944a_ad51d76dbf32.slice/crio-e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731 WatchSource:0}: Error finding container 
e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731: Status 404 returned error can't find the container with id e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731 Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.813716 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output=< Oct 03 10:01:29 crc kubenswrapper[4990]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 10:01:29 crc kubenswrapper[4990]: > Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.912322 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fblmx" event={"ID":"dc1b0e32-43df-49be-944a-ad51d76dbf32","Type":"ContainerStarted","Data":"e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.914018 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20bd-account-create-dxfch" event={"ID":"feb8b5ec-6556-4999-a5a9-4f1e22dc4140","Type":"ContainerStarted","Data":"a97875ccc65a4d94561b8c329188369807b3b4c23aac3230cdb73d4704ca67de"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.915115 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"127cc36df918eeae38ea1d8c57b81fbff89e450fc2bd33ee21659f582a24901b"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.916735 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tl56s" event={"ID":"1e549854-6717-4898-a5ee-aca6972206a7","Type":"ContainerStarted","Data":"4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.918972 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-nfzkg-config-vt6h8" event={"ID":"da35976c-7da5-428a-bb8d-10c55388118f","Type":"ContainerStarted","Data":"301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.921024 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5gv6z" event={"ID":"88e71897-19ff-43cc-86cb-e2c5c2bce780","Type":"ContainerStarted","Data":"113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a"} Oct 03 10:01:29 crc kubenswrapper[4990]: I1003 10:01:29.922529 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vc47v" event={"ID":"edcff799-1dbf-4115-9f05-3e5164b331ad","Type":"ContainerStarted","Data":"0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd"} Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.941446 4990 generic.go:334] "Generic (PLEG): container finished" podID="dc1b0e32-43df-49be-944a-ad51d76dbf32" containerID="1ca4954d0191dc08f83434b422e932c5bfb8ed7e5628dd4ee7b78dfa04d4c8c4" exitCode=0 Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.941628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fblmx" event={"ID":"dc1b0e32-43df-49be-944a-ad51d76dbf32","Type":"ContainerDied","Data":"1ca4954d0191dc08f83434b422e932c5bfb8ed7e5628dd4ee7b78dfa04d4c8c4"} Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.944358 4990 generic.go:334] "Generic (PLEG): container finished" podID="feb8b5ec-6556-4999-a5a9-4f1e22dc4140" containerID="a97875ccc65a4d94561b8c329188369807b3b4c23aac3230cdb73d4704ca67de" exitCode=0 Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.944414 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20bd-account-create-dxfch" event={"ID":"feb8b5ec-6556-4999-a5a9-4f1e22dc4140","Type":"ContainerDied","Data":"a97875ccc65a4d94561b8c329188369807b3b4c23aac3230cdb73d4704ca67de"} Oct 03 10:01:31 crc 
kubenswrapper[4990]: I1003 10:01:31.946273 4990 generic.go:334] "Generic (PLEG): container finished" podID="1e549854-6717-4898-a5ee-aca6972206a7" containerID="fcf62b1283b33d38aeee0f2d91a2e4ca85cd88ae3945b5041a9e1aea0b6d1774" exitCode=0 Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.946320 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tl56s" event={"ID":"1e549854-6717-4898-a5ee-aca6972206a7","Type":"ContainerDied","Data":"fcf62b1283b33d38aeee0f2d91a2e4ca85cd88ae3945b5041a9e1aea0b6d1774"} Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.948432 4990 generic.go:334] "Generic (PLEG): container finished" podID="da35976c-7da5-428a-bb8d-10c55388118f" containerID="1f910d8c95b928da975f02b11022b10642c780d91d0a9a470d6c5f0f5abd6093" exitCode=0 Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.948602 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg-config-vt6h8" event={"ID":"da35976c-7da5-428a-bb8d-10c55388118f","Type":"ContainerDied","Data":"1f910d8c95b928da975f02b11022b10642c780d91d0a9a470d6c5f0f5abd6093"} Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.950159 4990 generic.go:334] "Generic (PLEG): container finished" podID="88e71897-19ff-43cc-86cb-e2c5c2bce780" containerID="1633e1faa9b6a2e02ae1af8387687c60cead6ef65411004b7d61911b8219235d" exitCode=0 Oct 03 10:01:31 crc kubenswrapper[4990]: I1003 10:01:31.950223 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5gv6z" event={"ID":"88e71897-19ff-43cc-86cb-e2c5c2bce780","Type":"ContainerDied","Data":"1633e1faa9b6a2e02ae1af8387687c60cead6ef65411004b7d61911b8219235d"} Oct 03 10:01:32 crc kubenswrapper[4990]: I1003 10:01:32.960223 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182"} Oct 03 10:01:32 
crc kubenswrapper[4990]: I1003 10:01:32.960749 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555"} Oct 03 10:01:32 crc kubenswrapper[4990]: I1003 10:01:32.960761 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301"} Oct 03 10:01:32 crc kubenswrapper[4990]: I1003 10:01:32.963105 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4h7gg" event={"ID":"b48916be-afdb-47ca-8eed-d7ad817883b3","Type":"ContainerStarted","Data":"fd9e74c91e6a08463958110965cb2cda9b429bda7c5d992f466ea00d8861ed7b"} Oct 03 10:01:32 crc kubenswrapper[4990]: I1003 10:01:32.984166 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4h7gg" podStartSLOduration=3.1542355 podStartE2EDuration="19.984144475s" podCreationTimestamp="2025-10-03 10:01:13 +0000 UTC" firstStartedPulling="2025-10-03 10:01:14.463940916 +0000 UTC m=+1056.260572773" lastFinishedPulling="2025-10-03 10:01:31.293849891 +0000 UTC m=+1073.090481748" observedRunningTime="2025-10-03 10:01:32.980328352 +0000 UTC m=+1074.776960229" watchObservedRunningTime="2025-10-03 10:01:32.984144475 +0000 UTC m=+1074.780776332" Oct 03 10:01:33 crc kubenswrapper[4990]: I1003 10:01:33.988019 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7"} Oct 03 10:01:34 crc kubenswrapper[4990]: I1003 10:01:34.786932 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nfzkg" Oct 03 
10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.420370 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.426735 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461592 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461679 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461757 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461848 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461895 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wzx6\" (UniqueName: 
\"kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6\") pod \"dc1b0e32-43df-49be-944a-ad51d76dbf32\" (UID: \"dc1b0e32-43df-49be-944a-ad51d76dbf32\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461944 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.461965 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg2wn\" (UniqueName: \"kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn\") pod \"da35976c-7da5-428a-bb8d-10c55388118f\" (UID: \"da35976c-7da5-428a-bb8d-10c55388118f\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.463127 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run" (OuterVolumeSpecName: "var-run") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.463166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.463187 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.463801 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.464021 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts" (OuterVolumeSpecName: "scripts") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.468231 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn" (OuterVolumeSpecName: "kube-api-access-pg2wn") pod "da35976c-7da5-428a-bb8d-10c55388118f" (UID: "da35976c-7da5-428a-bb8d-10c55388118f"). InnerVolumeSpecName "kube-api-access-pg2wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.468330 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6" (OuterVolumeSpecName: "kube-api-access-8wzx6") pod "dc1b0e32-43df-49be-944a-ad51d76dbf32" (UID: "dc1b0e32-43df-49be-944a-ad51d76dbf32"). InnerVolumeSpecName "kube-api-access-8wzx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.563802 4990 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.563860 4990 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.563871 4990 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.563881 4990 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da35976c-7da5-428a-bb8d-10c55388118f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.564015 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wzx6\" (UniqueName: \"kubernetes.io/projected/dc1b0e32-43df-49be-944a-ad51d76dbf32-kube-api-access-8wzx6\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.564030 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/da35976c-7da5-428a-bb8d-10c55388118f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.564038 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg2wn\" (UniqueName: \"kubernetes.io/projected/da35976c-7da5-428a-bb8d-10c55388118f-kube-api-access-pg2wn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.597285 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.605544 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.645786 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.665746 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5x5z\" (UniqueName: \"kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z\") pod \"feb8b5ec-6556-4999-a5a9-4f1e22dc4140\" (UID: \"feb8b5ec-6556-4999-a5a9-4f1e22dc4140\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.665912 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfdr9\" (UniqueName: \"kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9\") pod \"88e71897-19ff-43cc-86cb-e2c5c2bce780\" (UID: \"88e71897-19ff-43cc-86cb-e2c5c2bce780\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.670502 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9" (OuterVolumeSpecName: "kube-api-access-bfdr9") pod 
"88e71897-19ff-43cc-86cb-e2c5c2bce780" (UID: "88e71897-19ff-43cc-86cb-e2c5c2bce780"). InnerVolumeSpecName "kube-api-access-bfdr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.671077 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z" (OuterVolumeSpecName: "kube-api-access-x5x5z") pod "feb8b5ec-6556-4999-a5a9-4f1e22dc4140" (UID: "feb8b5ec-6556-4999-a5a9-4f1e22dc4140"). InnerVolumeSpecName "kube-api-access-x5x5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.767839 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdctz\" (UniqueName: \"kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz\") pod \"1e549854-6717-4898-a5ee-aca6972206a7\" (UID: \"1e549854-6717-4898-a5ee-aca6972206a7\") " Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.768156 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfdr9\" (UniqueName: \"kubernetes.io/projected/88e71897-19ff-43cc-86cb-e2c5c2bce780-kube-api-access-bfdr9\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.768169 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5x5z\" (UniqueName: \"kubernetes.io/projected/feb8b5ec-6556-4999-a5a9-4f1e22dc4140-kube-api-access-x5x5z\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.770801 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz" (OuterVolumeSpecName: "kube-api-access-bdctz") pod "1e549854-6717-4898-a5ee-aca6972206a7" (UID: "1e549854-6717-4898-a5ee-aca6972206a7"). InnerVolumeSpecName "kube-api-access-bdctz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:35 crc kubenswrapper[4990]: I1003 10:01:35.870062 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdctz\" (UniqueName: \"kubernetes.io/projected/1e549854-6717-4898-a5ee-aca6972206a7-kube-api-access-bdctz\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.018890 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-20bd-account-create-dxfch" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.018946 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20bd-account-create-dxfch" event={"ID":"feb8b5ec-6556-4999-a5a9-4f1e22dc4140","Type":"ContainerDied","Data":"30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.019017 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ef010dd2c0b00146425bed9551616bdb2e9b5063546d60f75cf153e6b4e599" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.021368 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tl56s" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.021633 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tl56s" event={"ID":"1e549854-6717-4898-a5ee-aca6972206a7","Type":"ContainerDied","Data":"4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.021814 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b4097327bc41b295df1eba5664621862db88a2b31a9febae8f48bff06419604" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.023857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg-config-vt6h8" event={"ID":"da35976c-7da5-428a-bb8d-10c55388118f","Type":"ContainerDied","Data":"301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.024018 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="301d5d6e90d60cdf2a8d4a5b882c865f2e106529d9ef80ffb4fa628ec8a1d894" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.024213 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-vt6h8" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.036102 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5gv6z" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.036153 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5gv6z" event={"ID":"88e71897-19ff-43cc-86cb-e2c5c2bce780","Type":"ContainerDied","Data":"113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.036204 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="113739c8753f4ea43e33e6d7955ac6bea9177748f30f15993aff2ae507fbef2a" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.042413 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vc47v" event={"ID":"edcff799-1dbf-4115-9f05-3e5164b331ad","Type":"ContainerStarted","Data":"605055cfc80d36f87f85bf950602b2ae0f9b1559ba2ee7e06b749c78a8b9e130"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.044450 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fblmx" event={"ID":"dc1b0e32-43df-49be-944a-ad51d76dbf32","Type":"ContainerDied","Data":"e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731"} Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.044480 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e106ed772b4f64a23484479a0d7c97bdc2a099f423abb4aed70f662f11b71731" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.044578 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fblmx" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.079444 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vc47v" podStartSLOduration=2.40008322 podStartE2EDuration="8.07942756s" podCreationTimestamp="2025-10-03 10:01:28 +0000 UTC" firstStartedPulling="2025-10-03 10:01:29.66410062 +0000 UTC m=+1071.460732477" lastFinishedPulling="2025-10-03 10:01:35.34344496 +0000 UTC m=+1077.140076817" observedRunningTime="2025-10-03 10:01:36.067874217 +0000 UTC m=+1077.864506074" watchObservedRunningTime="2025-10-03 10:01:36.07942756 +0000 UTC m=+1077.876059417" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.542425 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfzkg-config-vt6h8"] Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.554051 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfzkg-config-vt6h8"] Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646242 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfzkg-config-nzt5m"] Oct 03 10:01:36 crc kubenswrapper[4990]: E1003 10:01:36.646844 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e71897-19ff-43cc-86cb-e2c5c2bce780" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646860 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e71897-19ff-43cc-86cb-e2c5c2bce780" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: E1003 10:01:36.646872 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb8b5ec-6556-4999-a5a9-4f1e22dc4140" containerName="mariadb-account-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646880 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb8b5ec-6556-4999-a5a9-4f1e22dc4140" 
containerName="mariadb-account-create" Oct 03 10:01:36 crc kubenswrapper[4990]: E1003 10:01:36.646895 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e549854-6717-4898-a5ee-aca6972206a7" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646902 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e549854-6717-4898-a5ee-aca6972206a7" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: E1003 10:01:36.646933 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1b0e32-43df-49be-944a-ad51d76dbf32" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646940 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1b0e32-43df-49be-944a-ad51d76dbf32" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: E1003 10:01:36.646947 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da35976c-7da5-428a-bb8d-10c55388118f" containerName="ovn-config" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.646956 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="da35976c-7da5-428a-bb8d-10c55388118f" containerName="ovn-config" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.647173 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="da35976c-7da5-428a-bb8d-10c55388118f" containerName="ovn-config" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.647204 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e71897-19ff-43cc-86cb-e2c5c2bce780" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.647218 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb8b5ec-6556-4999-a5a9-4f1e22dc4140" containerName="mariadb-account-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.647234 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1e549854-6717-4898-a5ee-aca6972206a7" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.647251 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1b0e32-43df-49be-944a-ad51d76dbf32" containerName="mariadb-database-create" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.648074 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.652156 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.670482 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg-config-nzt5m"] Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.683739 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.683926 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.683988 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: 
\"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.684059 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.684125 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxx7\" (UniqueName: \"kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.684156 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.785916 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.785994 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn\") pod 
\"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786030 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxx7\" (UniqueName: \"kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786054 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786108 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786169 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786322 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn\") pod 
\"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786321 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786375 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.786997 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.797874 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.805976 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxx7\" (UniqueName: \"kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7\") pod \"ovn-controller-nfzkg-config-nzt5m\" (UID: 
\"56254a4a-a677-4674-9740-f9da26cef2e4\") " pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.882056 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da35976c-7da5-428a-bb8d-10c55388118f" path="/var/lib/kubelet/pods/da35976c-7da5-428a-bb8d-10c55388118f/volumes" Oct 03 10:01:36 crc kubenswrapper[4990]: I1003 10:01:36.966884 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:37 crc kubenswrapper[4990]: I1003 10:01:37.065243 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4"} Oct 03 10:01:37 crc kubenswrapper[4990]: I1003 10:01:37.065317 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab"} Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:37.408812 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfzkg-config-nzt5m"] Oct 03 10:01:39 crc kubenswrapper[4990]: W1003 10:01:37.417085 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56254a4a_a677_4674_9740_f9da26cef2e4.slice/crio-1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0 WatchSource:0}: Error finding container 1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0: Status 404 returned error can't find the container with id 1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0 Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:38.085714 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5"} Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:38.086126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157"} Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:38.092577 4990 generic.go:334] "Generic (PLEG): container finished" podID="56254a4a-a677-4674-9740-f9da26cef2e4" containerID="f7eaf253d3aa5512f25c1f9d5ffce15dc33c5cf1d6cff9139aebbef8d767c346" exitCode=0 Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:38.092620 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg-config-nzt5m" event={"ID":"56254a4a-a677-4674-9740-f9da26cef2e4","Type":"ContainerDied","Data":"f7eaf253d3aa5512f25c1f9d5ffce15dc33c5cf1d6cff9139aebbef8d767c346"} Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:38.092651 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg-config-nzt5m" event={"ID":"56254a4a-a677-4674-9740-f9da26cef2e4","Type":"ContainerStarted","Data":"1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0"} Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.887787 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940047 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpxx7\" (UniqueName: \"kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940087 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940139 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940179 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940193 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940242 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940339 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn\") pod \"56254a4a-a677-4674-9740-f9da26cef2e4\" (UID: \"56254a4a-a677-4674-9740-f9da26cef2e4\") " Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940352 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run" (OuterVolumeSpecName: "var-run") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940461 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940937 4990 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940959 4990 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940948 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.940971 4990 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/56254a4a-a677-4674-9740-f9da26cef2e4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.941744 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts" (OuterVolumeSpecName: "scripts") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:39 crc kubenswrapper[4990]: I1003 10:01:39.947195 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7" (OuterVolumeSpecName: "kube-api-access-wpxx7") pod "56254a4a-a677-4674-9740-f9da26cef2e4" (UID: "56254a4a-a677-4674-9740-f9da26cef2e4"). InnerVolumeSpecName "kube-api-access-wpxx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.042121 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.042154 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpxx7\" (UniqueName: \"kubernetes.io/projected/56254a4a-a677-4674-9740-f9da26cef2e4-kube-api-access-wpxx7\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.042169 4990 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/56254a4a-a677-4674-9740-f9da26cef2e4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.114052 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54"} Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.116024 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg-config-nzt5m" event={"ID":"56254a4a-a677-4674-9740-f9da26cef2e4","Type":"ContainerDied","Data":"1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0"} Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 
10:01:40.116074 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg-config-nzt5m" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.116081 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c16e28b4d4b1adfb0b519235a29c0a0ed242c0c0fb824d290951831388bd1a0" Oct 03 10:01:40 crc kubenswrapper[4990]: I1003 10:01:40.997468 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfzkg-config-nzt5m"] Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.006064 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfzkg-config-nzt5m"] Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.132291 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4"} Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.132337 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1"} Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.132347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265"} Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.132356 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574"} Oct 03 10:01:41 crc kubenswrapper[4990]: I1003 10:01:41.132366 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086"} Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.147426 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerStarted","Data":"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c"} Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.151536 4990 generic.go:334] "Generic (PLEG): container finished" podID="b48916be-afdb-47ca-8eed-d7ad817883b3" containerID="fd9e74c91e6a08463958110965cb2cda9b429bda7c5d992f466ea00d8861ed7b" exitCode=0 Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.151597 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4h7gg" event={"ID":"b48916be-afdb-47ca-8eed-d7ad817883b3","Type":"ContainerDied","Data":"fd9e74c91e6a08463958110965cb2cda9b429bda7c5d992f466ea00d8861ed7b"} Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.156838 4990 generic.go:334] "Generic (PLEG): container finished" podID="edcff799-1dbf-4115-9f05-3e5164b331ad" containerID="605055cfc80d36f87f85bf950602b2ae0f9b1559ba2ee7e06b749c78a8b9e130" exitCode=0 Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.156887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vc47v" event={"ID":"edcff799-1dbf-4115-9f05-3e5164b331ad","Type":"ContainerDied","Data":"605055cfc80d36f87f85bf950602b2ae0f9b1559ba2ee7e06b749c78a8b9e130"} Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.183623 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=41.466776073 podStartE2EDuration="52.18360564s" podCreationTimestamp="2025-10-03 10:00:50 +0000 UTC" firstStartedPulling="2025-10-03 10:01:29.156195424 +0000 UTC 
m=+1070.952827281" lastFinishedPulling="2025-10-03 10:01:39.873024991 +0000 UTC m=+1081.669656848" observedRunningTime="2025-10-03 10:01:42.182964714 +0000 UTC m=+1083.979596611" watchObservedRunningTime="2025-10-03 10:01:42.18360564 +0000 UTC m=+1083.980237497" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.455927 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:42 crc kubenswrapper[4990]: E1003 10:01:42.456764 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56254a4a-a677-4674-9740-f9da26cef2e4" containerName="ovn-config" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.456856 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="56254a4a-a677-4674-9740-f9da26cef2e4" containerName="ovn-config" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.457146 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="56254a4a-a677-4674-9740-f9da26cef2e4" containerName="ovn-config" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.458378 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.461027 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.481753 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498761 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498880 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " 
pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498925 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9lsq\" (UniqueName: \"kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.498964 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600132 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600582 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " 
pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600692 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600716 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.600736 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9lsq\" (UniqueName: \"kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.601056 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.601114 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: 
I1003 10:01:42.601619 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.601795 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.602094 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.620173 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9lsq\" (UniqueName: \"kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq\") pod \"dnsmasq-dns-7c4796f76f-f4m7n\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.780582 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:42 crc kubenswrapper[4990]: I1003 10:01:42.891956 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56254a4a-a677-4674-9740-f9da26cef2e4" path="/var/lib/kubelet/pods/56254a4a-a677-4674-9740-f9da26cef2e4/volumes" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.268549 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.391952 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.511880 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data\") pod \"edcff799-1dbf-4115-9f05-3e5164b331ad\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.512286 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle\") pod \"edcff799-1dbf-4115-9f05-3e5164b331ad\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.512784 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfjpg\" (UniqueName: \"kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg\") pod \"edcff799-1dbf-4115-9f05-3e5164b331ad\" (UID: \"edcff799-1dbf-4115-9f05-3e5164b331ad\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.518390 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg" (OuterVolumeSpecName: 
"kube-api-access-tfjpg") pod "edcff799-1dbf-4115-9f05-3e5164b331ad" (UID: "edcff799-1dbf-4115-9f05-3e5164b331ad"). InnerVolumeSpecName "kube-api-access-tfjpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.537634 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.540996 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edcff799-1dbf-4115-9f05-3e5164b331ad" (UID: "edcff799-1dbf-4115-9f05-3e5164b331ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.569232 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data" (OuterVolumeSpecName: "config-data") pod "edcff799-1dbf-4115-9f05-3e5164b331ad" (UID: "edcff799-1dbf-4115-9f05-3e5164b331ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.615069 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.615101 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfjpg\" (UniqueName: \"kubernetes.io/projected/edcff799-1dbf-4115-9f05-3e5164b331ad-kube-api-access-tfjpg\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.615114 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edcff799-1dbf-4115-9f05-3e5164b331ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.715891 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle\") pod \"b48916be-afdb-47ca-8eed-d7ad817883b3\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.716026 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4grm2\" (UniqueName: \"kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2\") pod \"b48916be-afdb-47ca-8eed-d7ad817883b3\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.716077 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data\") pod \"b48916be-afdb-47ca-8eed-d7ad817883b3\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 
10:01:43.716149 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data\") pod \"b48916be-afdb-47ca-8eed-d7ad817883b3\" (UID: \"b48916be-afdb-47ca-8eed-d7ad817883b3\") " Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.720523 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2" (OuterVolumeSpecName: "kube-api-access-4grm2") pod "b48916be-afdb-47ca-8eed-d7ad817883b3" (UID: "b48916be-afdb-47ca-8eed-d7ad817883b3"). InnerVolumeSpecName "kube-api-access-4grm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.720737 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b48916be-afdb-47ca-8eed-d7ad817883b3" (UID: "b48916be-afdb-47ca-8eed-d7ad817883b3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.738140 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b48916be-afdb-47ca-8eed-d7ad817883b3" (UID: "b48916be-afdb-47ca-8eed-d7ad817883b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.757578 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data" (OuterVolumeSpecName: "config-data") pod "b48916be-afdb-47ca-8eed-d7ad817883b3" (UID: "b48916be-afdb-47ca-8eed-d7ad817883b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.817598 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4grm2\" (UniqueName: \"kubernetes.io/projected/b48916be-afdb-47ca-8eed-d7ad817883b3-kube-api-access-4grm2\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.817634 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.817648 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:43 crc kubenswrapper[4990]: I1003 10:01:43.817659 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48916be-afdb-47ca-8eed-d7ad817883b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.173315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4h7gg" event={"ID":"b48916be-afdb-47ca-8eed-d7ad817883b3","Type":"ContainerDied","Data":"334902d3b927a6863c28d45693d5f9c82d8d0e90a50fb639a9e1597b9287ec44"} Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.173680 4990 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="334902d3b927a6863c28d45693d5f9c82d8d0e90a50fb639a9e1597b9287ec44" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.173750 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4h7gg" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.176762 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vc47v" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.176928 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vc47v" event={"ID":"edcff799-1dbf-4115-9f05-3e5164b331ad","Type":"ContainerDied","Data":"0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd"} Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.177031 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.179073 4990 generic.go:334] "Generic (PLEG): container finished" podID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerID="10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97" exitCode=0 Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.179214 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" event={"ID":"8073a358-c08a-4bc3-b147-77e1a60dd54c","Type":"ContainerDied","Data":"10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97"} Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.179251 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" event={"ID":"8073a358-c08a-4bc3-b147-77e1a60dd54c","Type":"ContainerStarted","Data":"33a8e3add34fc7dca51f198297369d62394d9d9bcaa6c67a4dcc8dc0e21e8f85"} Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.496438 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:44 crc kubenswrapper[4990]: E1003 10:01:44.500361 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedcff799_1dbf_4115_9f05_3e5164b331ad.slice/crio-0ca36ff5eaba8c95cfd58516ef630b75bb1e4f4b362f456079ef80a59b29c0cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb48916be_afdb_47ca_8eed_d7ad817883b3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb48916be_afdb_47ca_8eed_d7ad817883b3.slice/crio-334902d3b927a6863c28d45693d5f9c82d8d0e90a50fb639a9e1597b9287ec44\": RecentStats: unable to find data in memory cache]" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.537768 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9"] Oct 03 10:01:44 crc kubenswrapper[4990]: E1003 10:01:44.538241 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48916be-afdb-47ca-8eed-d7ad817883b3" containerName="glance-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.538261 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48916be-afdb-47ca-8eed-d7ad817883b3" containerName="glance-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: E1003 10:01:44.538286 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcff799-1dbf-4115-9f05-3e5164b331ad" containerName="keystone-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.538293 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcff799-1dbf-4115-9f05-3e5164b331ad" containerName="keystone-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.538472 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="edcff799-1dbf-4115-9f05-3e5164b331ad" 
containerName="keystone-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.538487 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48916be-afdb-47ca-8eed-d7ad817883b3" containerName="glance-db-sync" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.539579 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.572698 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.591475 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d264n"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.592961 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.596061 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk8mw" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.596320 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.596538 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.596675 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.610026 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d264n"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.650112 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config\") 
pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.650952 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4tlv\" (UniqueName: \"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.650992 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.651050 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.651080 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.651122 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.712078 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9"] Oct 03 10:01:44 crc kubenswrapper[4990]: E1003 10:01:44.713418 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-s4tlv ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" podUID="471e10f0-b793-44c3-9f14-56b94f2b52a7" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752012 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752070 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752103 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752125 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752178 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752206 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5xc\" (UniqueName: \"kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752244 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752290 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752317 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752359 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4tlv\" (UniqueName: \"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.752383 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.753494 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.753901 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.753919 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.754637 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.755414 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.773280 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.774915 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.789220 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4tlv\" (UniqueName: \"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv\") pod \"dnsmasq-dns-6f4f6fdb8f-lpxx9\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.791692 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.835307 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.837856 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.850547 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.850722 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.856969 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857054 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " 
pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857113 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857147 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5xc\" (UniqueName: \"kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857233 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857308 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857383 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857418 
4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857478 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857632 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857670 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq7v\" (UniqueName: \"kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857770 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 
10:01:44.857808 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrfph\" (UniqueName: \"kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857845 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857876 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857899 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857936 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.857963 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.858007 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.866096 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.868100 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.873039 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.875472 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data\") pod 
\"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.891079 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.907405 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.919682 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5xc\" (UniqueName: \"kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc\") pod \"keystone-bootstrap-d264n\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963452 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963541 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963563 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq7v\" 
(UniqueName: \"kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963590 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrfph\" (UniqueName: \"kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963638 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963660 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963677 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts\") pod \"ceilometer-0\" (UID: 
\"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963699 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963715 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963764 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963787 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.963816 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.964720 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.965108 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.965771 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.965894 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.971073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.971636 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: 
\"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.971786 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.981668 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.981801 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.982787 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:44 crc kubenswrapper[4990]: I1003 10:01:44.986095 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.035805 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq7v\" 
(UniqueName: \"kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v\") pod \"dnsmasq-dns-7d56678497-swgzd\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.044987 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrfph\" (UniqueName: \"kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph\") pod \"ceilometer-0\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " pod="openstack/ceilometer-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.079889 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ckmxp"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.081260 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.095462 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.110084 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ckmxp"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.113293 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.113762 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v92hc" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.127942 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.128850 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.186134 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.199474 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.213839 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.235567 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.272887 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.272956 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.273063 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.273109 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cnl2\" (UniqueName: \"kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.273146 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.276910 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" event={"ID":"8073a358-c08a-4bc3-b147-77e1a60dd54c","Type":"ContainerStarted","Data":"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b"} Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.276981 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.277034 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.289723 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.313103 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374258 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374358 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374456 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374487 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4tlv\" (UniqueName: \"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374558 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.374614 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config\") pod \"471e10f0-b793-44c3-9f14-56b94f2b52a7\" (UID: \"471e10f0-b793-44c3-9f14-56b94f2b52a7\") " Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375057 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375088 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375142 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cnl2\" (UniqueName: \"kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375163 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375186 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375220 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375294 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375396 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvssv\" (UniqueName: \"kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375421 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375444 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.375465 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.376200 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.376653 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.376949 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.378219 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config" (OuterVolumeSpecName: "config") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.378684 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.381783 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.382741 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" podStartSLOduration=3.38272465 podStartE2EDuration="3.38272465s" podCreationTimestamp="2025-10-03 10:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:45.333927513 +0000 UTC m=+1087.130559380" watchObservedRunningTime="2025-10-03 10:01:45.38272465 +0000 UTC m=+1087.179356517" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.382173 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv" (OuterVolumeSpecName: "kube-api-access-s4tlv") pod "471e10f0-b793-44c3-9f14-56b94f2b52a7" (UID: "471e10f0-b793-44c3-9f14-56b94f2b52a7"). InnerVolumeSpecName "kube-api-access-s4tlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.385080 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.385956 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.409582 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.416392 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cnl2\" (UniqueName: \"kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2\") pod \"placement-db-sync-ckmxp\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.433407 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-dff5-account-create-cf9kx"] Oct 03 10:01:45 crc 
kubenswrapper[4990]: I1003 10:01:45.434567 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.440440 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dff5-account-create-cf9kx"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.442696 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.450213 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ckmxp" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477048 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477118 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477155 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvssv\" 
(UniqueName: \"kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477259 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477342 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.477935 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480592 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480644 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480658 4990 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480669 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480680 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4tlv\" (UniqueName: \"kubernetes.io/projected/471e10f0-b793-44c3-9f14-56b94f2b52a7-kube-api-access-s4tlv\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480689 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/471e10f0-b793-44c3-9f14-56b94f2b52a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.480759 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.481081 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.481438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc\") pod 
\"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.482634 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.509992 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvssv\" (UniqueName: \"kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv\") pod \"dnsmasq-dns-78df67ddff-2zbr2\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") " pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.546301 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bb2c-account-create-cxvp9"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.547421 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.549409 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.557804 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bb2c-account-create-cxvp9"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.576456 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.584253 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59llt\" (UniqueName: \"kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt\") pod \"cinder-dff5-account-create-cf9kx\" (UID: \"b45912f0-ed72-4751-9543-bacd041baba0\") " pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.628315 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b28-account-create-cpqx4"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.629709 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.636657 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.648951 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b28-account-create-cpqx4"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.686753 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59llt\" (UniqueName: \"kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt\") pod \"cinder-dff5-account-create-cf9kx\" (UID: \"b45912f0-ed72-4751-9543-bacd041baba0\") " pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.689544 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5qc\" (UniqueName: \"kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc\") pod \"barbican-bb2c-account-create-cxvp9\" (UID: \"3609b86b-bac8-4a2b-a96c-9e8a317deecf\") " 
pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.707346 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59llt\" (UniqueName: \"kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt\") pod \"cinder-dff5-account-create-cf9kx\" (UID: \"b45912f0-ed72-4751-9543-bacd041baba0\") " pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.738837 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.740654 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.744759 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.744968 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.745099 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bw7f6" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.764890 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.768800 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.795310 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5qc\" (UniqueName: \"kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc\") pod \"barbican-bb2c-account-create-cxvp9\" (UID: \"3609b86b-bac8-4a2b-a96c-9e8a317deecf\") " pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.795441 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vbc5\" (UniqueName: \"kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5\") pod \"neutron-7b28-account-create-cpqx4\" (UID: \"87ca1241-a022-4b2a-988f-79e2fd42e6aa\") " pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.824030 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5qc\" (UniqueName: \"kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc\") pod \"barbican-bb2c-account-create-cxvp9\" (UID: \"3609b86b-bac8-4a2b-a96c-9e8a317deecf\") " pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.838320 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:45 crc kubenswrapper[4990]: W1003 10:01:45.855471 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf03a9d_f212_4cf4_b60f_7c3086446697.slice/crio-7e5ddd360a2952a595dc29bd05e413334cf360ebb0b769e9f268274ea643aaeb WatchSource:0}: Error finding container 7e5ddd360a2952a595dc29bd05e413334cf360ebb0b769e9f268274ea643aaeb: Status 404 returned error can't find the container with id 
7e5ddd360a2952a595dc29bd05e413334cf360ebb0b769e9f268274ea643aaeb Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.877990 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.911973 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdxw\" (UniqueName: \"kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.912446 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.912617 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.912726 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vbc5\" (UniqueName: \"kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5\") pod \"neutron-7b28-account-create-cpqx4\" (UID: \"87ca1241-a022-4b2a-988f-79e2fd42e6aa\") " pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.912787 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.913412 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.913542 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.913620 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.941427 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vbc5\" (UniqueName: \"kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5\") pod \"neutron-7b28-account-create-cpqx4\" (UID: \"87ca1241-a022-4b2a-988f-79e2fd42e6aa\") " pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:45 crc kubenswrapper[4990]: I1003 10:01:45.981336 4990 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-bootstrap-d264n"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.012865 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.014866 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.017211 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.019075 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021271 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021377 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021436 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021502 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021557 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021657 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdxw\" (UniqueName: \"kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.021748 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.024136 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.026822 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.026835 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.027090 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.030234 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: W1003 10:01:46.039823 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a09106e_59ee_4666_940e_7ffeaab8f83f.slice/crio-3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069 WatchSource:0}: Error finding container 3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069: Status 404 returned error can't find the container with id 3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069 Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.051285 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.060238 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.073376 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdxw\" (UniqueName: \"kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.087178 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.091813 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.122927 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.122991 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.123142 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.123186 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.123293 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.123373 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.123681 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lxbk\" (UniqueName: 
\"kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.165194 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dff5-account-create-cf9kx"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.184678 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.219006 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ckmxp"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.225819 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lxbk\" (UniqueName: \"kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.225944 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226013 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226054 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226116 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226648 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.226736 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.227336 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.236712 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.242104 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.244555 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.245284 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.247845 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lxbk\" (UniqueName: \"kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.284815 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.312002 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dff5-account-create-cf9kx" event={"ID":"b45912f0-ed72-4751-9543-bacd041baba0","Type":"ContainerStarted","Data":"d60095bfbad833da13cdd2fbe6eb91dc63919d125917715b4fe61ac239412aea"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.325336 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerStarted","Data":"fee84227d4a74583c3cb012766f280ee20445d563e9965efb9a18217f3b966b2"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.334706 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" 
event={"ID":"a06fc0e3-65b2-42fc-abac-e1774866d250","Type":"ContainerStarted","Data":"3c2decd74c81bfffbdebd8aa3dc6d2d51585be43cdb4c19e16f6c5d4d6cc828e"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.336775 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d264n" event={"ID":"4a09106e-59ee-4666-940e-7ffeaab8f83f","Type":"ContainerStarted","Data":"3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.338494 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ckmxp" event={"ID":"5c8a19e1-8b84-48a7-8dde-f22078695aa9","Type":"ContainerStarted","Data":"ba8f3c06dee832a2e6c10bf03950f9ba4f5e29e9dfc5e4cecfa817aa8289e609"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.345140 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.346623 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56678497-swgzd" event={"ID":"9cf03a9d-f212-4cf4-b60f-7c3086446697","Type":"ContainerStarted","Data":"7e5ddd360a2952a595dc29bd05e413334cf360ebb0b769e9f268274ea643aaeb"} Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.346650 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d56678497-swgzd" podUID="9cf03a9d-f212-4cf4-b60f-7c3086446697" containerName="init" containerID="cri-o://3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379" gracePeriod=10 Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.346839 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="dnsmasq-dns" containerID="cri-o://3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b" gracePeriod=10 Oct 03 10:01:46 
crc kubenswrapper[4990]: I1003 10:01:46.356571 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.429723 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bb2c-account-create-cxvp9"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.522558 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.533051 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f4f6fdb8f-lpxx9"] Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.724813 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b28-account-create-cpqx4"] Oct 03 10:01:46 crc kubenswrapper[4990]: W1003 10:01:46.727862 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87ca1241_a022_4b2a_988f_79e2fd42e6aa.slice/crio-12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e WatchSource:0}: Error finding container 12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e: Status 404 returned error can't find the container with id 12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e Oct 03 10:01:46 crc kubenswrapper[4990]: I1003 10:01:46.884055 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471e10f0-b793-44c3-9f14-56b94f2b52a7" path="/var/lib/kubelet/pods/471e10f0-b793-44c3-9f14-56b94f2b52a7/volumes" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.077368 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.105129 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.115747 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.151962 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.223769 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257496 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257654 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257708 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9lsq\" (UniqueName: \"kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257736 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcq7v\" (UniqueName: \"kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: 
\"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257770 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257794 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257841 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257866 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257909 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257939 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257962 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc\") pod \"8073a358-c08a-4bc3-b147-77e1a60dd54c\" (UID: \"8073a358-c08a-4bc3-b147-77e1a60dd54c\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.257995 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc\") pod \"9cf03a9d-f212-4cf4-b60f-7c3086446697\" (UID: \"9cf03a9d-f212-4cf4-b60f-7c3086446697\") " Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.288577 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.305737 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq" (OuterVolumeSpecName: "kube-api-access-v9lsq") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). InnerVolumeSpecName "kube-api-access-v9lsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.305847 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v" (OuterVolumeSpecName: "kube-api-access-qcq7v") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "kube-api-access-qcq7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.317209 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.318788 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config" (OuterVolumeSpecName: "config") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.326888 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.357881 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364589 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364619 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9lsq\" (UniqueName: \"kubernetes.io/projected/8073a358-c08a-4bc3-b147-77e1a60dd54c-kube-api-access-v9lsq\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364629 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcq7v\" (UniqueName: \"kubernetes.io/projected/9cf03a9d-f212-4cf4-b60f-7c3086446697-kube-api-access-qcq7v\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364642 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364650 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.364659 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.368157 4990 generic.go:334] "Generic (PLEG): container finished" podID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerID="3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b" exitCode=0 Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.368291 4990 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.368636 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" event={"ID":"8073a358-c08a-4bc3-b147-77e1a60dd54c","Type":"ContainerDied","Data":"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.369142 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4796f76f-f4m7n" event={"ID":"8073a358-c08a-4bc3-b147-77e1a60dd54c","Type":"ContainerDied","Data":"33a8e3add34fc7dca51f198297369d62394d9d9bcaa6c67a4dcc8dc0e21e8f85"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.369168 4990 scope.go:117] "RemoveContainer" containerID="3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.376486 4990 generic.go:334] "Generic (PLEG): container finished" podID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerID="77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3" exitCode=0 Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.376611 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" event={"ID":"a06fc0e3-65b2-42fc-abac-e1774866d250","Type":"ContainerDied","Data":"77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.378794 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9cf03a9d-f212-4cf4-b60f-7c3086446697" (UID: "9cf03a9d-f212-4cf4-b60f-7c3086446697"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.382519 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d264n" event={"ID":"4a09106e-59ee-4666-940e-7ffeaab8f83f","Type":"ContainerStarted","Data":"ef11333bf94355cba01475b1c22a8097568481d7e330f677fc3d3c2369b3bdc4"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.399048 4990 generic.go:334] "Generic (PLEG): container finished" podID="9cf03a9d-f212-4cf4-b60f-7c3086446697" containerID="3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379" exitCode=0 Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.399136 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56678497-swgzd" event={"ID":"9cf03a9d-f212-4cf4-b60f-7c3086446697","Type":"ContainerDied","Data":"3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.399164 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d56678497-swgzd" event={"ID":"9cf03a9d-f212-4cf4-b60f-7c3086446697","Type":"ContainerDied","Data":"7e5ddd360a2952a595dc29bd05e413334cf360ebb0b769e9f268274ea643aaeb"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.399227 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d56678497-swgzd" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.411869 4990 scope.go:117] "RemoveContainer" containerID="10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.417762 4990 generic.go:334] "Generic (PLEG): container finished" podID="3609b86b-bac8-4a2b-a96c-9e8a317deecf" containerID="6bcce0bfbc646707b7148f2d86f412c6ad5b8f72cdfc3ed1fe2a9dd30a9f075f" exitCode=0 Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.417851 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb2c-account-create-cxvp9" event={"ID":"3609b86b-bac8-4a2b-a96c-9e8a317deecf","Type":"ContainerDied","Data":"6bcce0bfbc646707b7148f2d86f412c6ad5b8f72cdfc3ed1fe2a9dd30a9f075f"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.417877 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb2c-account-create-cxvp9" event={"ID":"3609b86b-bac8-4a2b-a96c-9e8a317deecf","Type":"ContainerStarted","Data":"07b83f86f7231a11988bc0d8ac3fa28c65841082dec2870d03f3c7eed724322f"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.427209 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerStarted","Data":"12dfb8d9e8944b1ce6bdc7d07fea2fcf98c57dcde6b43da896efddc46e0c3dc0"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.429726 4990 generic.go:334] "Generic (PLEG): container finished" podID="b45912f0-ed72-4751-9543-bacd041baba0" containerID="f9db0b13dd512f4e09ae656d0b18ac85d17dbf8d25dbbc39405095596f9d32d0" exitCode=0 Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.429773 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dff5-account-create-cf9kx" 
event={"ID":"b45912f0-ed72-4751-9543-bacd041baba0","Type":"ContainerDied","Data":"f9db0b13dd512f4e09ae656d0b18ac85d17dbf8d25dbbc39405095596f9d32d0"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.432194 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b28-account-create-cpqx4" event={"ID":"87ca1241-a022-4b2a-988f-79e2fd42e6aa","Type":"ContainerStarted","Data":"b36b62113bc8b0a5153619c9ce15a20111b4cef90bbb719726c4d653486f1bd1"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.432224 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b28-account-create-cpqx4" event={"ID":"87ca1241-a022-4b2a-988f-79e2fd42e6aa","Type":"ContainerStarted","Data":"12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e"} Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.436067 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.443617 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config" (OuterVolumeSpecName: "config") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.452161 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.454219 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d264n" podStartSLOduration=3.454182453 podStartE2EDuration="3.454182453s" podCreationTimestamp="2025-10-03 10:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:47.423174203 +0000 UTC m=+1089.219806060" watchObservedRunningTime="2025-10-03 10:01:47.454182453 +0000 UTC m=+1089.250814310" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.455167 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.456259 4990 scope.go:117] "RemoveContainer" containerID="3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b" Oct 03 10:01:47 crc kubenswrapper[4990]: E1003 10:01:47.456680 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b\": container with ID starting with 3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b not found: ID does not exist" containerID="3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.456745 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b"} err="failed to get container status \"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b\": rpc error: code = NotFound desc = could not find container \"3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b\": container with ID starting with 3aee5308b13d672fc6a1fff3346c8c416536535bdca2ceddcb69c2034a15a38b not found: ID does not exist" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.456770 4990 scope.go:117] "RemoveContainer" containerID="10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97" Oct 03 10:01:47 crc kubenswrapper[4990]: E1003 10:01:47.462074 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97\": container with ID starting with 10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97 not found: ID does not exist" containerID="10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.462132 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97"} err="failed to get container status \"10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97\": rpc error: code = NotFound desc = could not find container \"10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97\": container with ID starting with 10e8e42696f948f4044549d2eabfbf4237ae4c4daa7b7882f728ce875db62d97 not found: ID does not exist" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.462175 4990 scope.go:117] "RemoveContainer" containerID="3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.466824 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.466853 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.466864 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.466873 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9cf03a9d-f212-4cf4-b60f-7c3086446697-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.466882 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.491964 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8073a358-c08a-4bc3-b147-77e1a60dd54c" (UID: "8073a358-c08a-4bc3-b147-77e1a60dd54c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.522270 4990 scope.go:117] "RemoveContainer" containerID="3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379" Oct 03 10:01:47 crc kubenswrapper[4990]: E1003 10:01:47.523010 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379\": container with ID starting with 3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379 not found: ID does not exist" containerID="3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.523043 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379"} err="failed to get container status \"3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379\": rpc error: code = NotFound desc = could not find container \"3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379\": container with ID starting with 3b7d3b10377df618bf6c11dd2d4a61419c4374aeafaee23953a6bd7df1d32379 not found: ID does not exist" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.531645 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 
10:01:47.542160 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d56678497-swgzd"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.567942 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8073a358-c08a-4bc3-b147-77e1a60dd54c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.762959 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:47 crc kubenswrapper[4990]: I1003 10:01:47.777369 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4796f76f-f4m7n"] Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.010492 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.448342 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" event={"ID":"a06fc0e3-65b2-42fc-abac-e1774866d250","Type":"ContainerStarted","Data":"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"} Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.448704 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.454878 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerStarted","Data":"87e659f07a0c49291eeb295e019d8fd0f6487c92fd04fc684400ce1f5a7364c3"} Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.457052 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerStarted","Data":"9f22c6ea783bc85f9e334e3b180eaed55240801e6e4f7222612a3d2f4031d05f"} Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.459124 4990 generic.go:334] "Generic (PLEG): container finished" podID="87ca1241-a022-4b2a-988f-79e2fd42e6aa" containerID="b36b62113bc8b0a5153619c9ce15a20111b4cef90bbb719726c4d653486f1bd1" exitCode=0 Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.459183 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b28-account-create-cpqx4" event={"ID":"87ca1241-a022-4b2a-988f-79e2fd42e6aa","Type":"ContainerDied","Data":"b36b62113bc8b0a5153619c9ce15a20111b4cef90bbb719726c4d653486f1bd1"} Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.487378 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" podStartSLOduration=3.487349993 podStartE2EDuration="3.487349993s" podCreationTimestamp="2025-10-03 10:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:48.485408415 +0000 UTC m=+1090.282040282" watchObservedRunningTime="2025-10-03 10:01:48.487349993 +0000 UTC m=+1090.283981850" Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.911361 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" path="/var/lib/kubelet/pods/8073a358-c08a-4bc3-b147-77e1a60dd54c/volumes" Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.912713 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf03a9d-f212-4cf4-b60f-7c3086446697" path="/var/lib/kubelet/pods/9cf03a9d-f212-4cf4-b60f-7c3086446697/volumes" Oct 03 10:01:48 crc kubenswrapper[4990]: I1003 10:01:48.988994 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.033144 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.057309 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.104282 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59llt\" (UniqueName: \"kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt\") pod \"b45912f0-ed72-4751-9543-bacd041baba0\" (UID: \"b45912f0-ed72-4751-9543-bacd041baba0\") " Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.104858 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vbc5\" (UniqueName: \"kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5\") pod \"87ca1241-a022-4b2a-988f-79e2fd42e6aa\" (UID: \"87ca1241-a022-4b2a-988f-79e2fd42e6aa\") " Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.109674 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt" (OuterVolumeSpecName: "kube-api-access-59llt") pod "b45912f0-ed72-4751-9543-bacd041baba0" (UID: "b45912f0-ed72-4751-9543-bacd041baba0"). InnerVolumeSpecName "kube-api-access-59llt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.109886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5" (OuterVolumeSpecName: "kube-api-access-9vbc5") pod "87ca1241-a022-4b2a-988f-79e2fd42e6aa" (UID: "87ca1241-a022-4b2a-988f-79e2fd42e6aa"). InnerVolumeSpecName "kube-api-access-9vbc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.206673 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5qc\" (UniqueName: \"kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc\") pod \"3609b86b-bac8-4a2b-a96c-9e8a317deecf\" (UID: \"3609b86b-bac8-4a2b-a96c-9e8a317deecf\") " Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.208213 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vbc5\" (UniqueName: \"kubernetes.io/projected/87ca1241-a022-4b2a-988f-79e2fd42e6aa-kube-api-access-9vbc5\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.208265 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59llt\" (UniqueName: \"kubernetes.io/projected/b45912f0-ed72-4751-9543-bacd041baba0-kube-api-access-59llt\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.210563 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc" (OuterVolumeSpecName: "kube-api-access-lc5qc") pod "3609b86b-bac8-4a2b-a96c-9e8a317deecf" (UID: "3609b86b-bac8-4a2b-a96c-9e8a317deecf"). InnerVolumeSpecName "kube-api-access-lc5qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.311168 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5qc\" (UniqueName: \"kubernetes.io/projected/3609b86b-bac8-4a2b-a96c-9e8a317deecf-kube-api-access-lc5qc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.476850 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dff5-account-create-cf9kx" event={"ID":"b45912f0-ed72-4751-9543-bacd041baba0","Type":"ContainerDied","Data":"d60095bfbad833da13cdd2fbe6eb91dc63919d125917715b4fe61ac239412aea"} Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.476895 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dff5-account-create-cf9kx" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.476908 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d60095bfbad833da13cdd2fbe6eb91dc63919d125917715b4fe61ac239412aea" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.480551 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b28-account-create-cpqx4" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.480602 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b28-account-create-cpqx4" event={"ID":"87ca1241-a022-4b2a-988f-79e2fd42e6aa","Type":"ContainerDied","Data":"12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e"} Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.480626 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12dead751e579a246941f1b69032cdf0201483186ea2369c606921817caba29e" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.489004 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bb2c-account-create-cxvp9" event={"ID":"3609b86b-bac8-4a2b-a96c-9e8a317deecf","Type":"ContainerDied","Data":"07b83f86f7231a11988bc0d8ac3fa28c65841082dec2870d03f3c7eed724322f"} Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.489048 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07b83f86f7231a11988bc0d8ac3fa28c65841082dec2870d03f3c7eed724322f" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.489166 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bb2c-account-create-cxvp9" Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.500054 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerStarted","Data":"96fc99a3813a59cc93a8df36354415b2374ed786f252ed3e0fb697ea29e0a766"} Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.500159 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-httpd" containerID="cri-o://96fc99a3813a59cc93a8df36354415b2374ed786f252ed3e0fb697ea29e0a766" gracePeriod=30 Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.501177 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-log" containerID="cri-o://87e659f07a0c49291eeb295e019d8fd0f6487c92fd04fc684400ce1f5a7364c3" gracePeriod=30 Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.515689 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerStarted","Data":"044ebf070f949da6d98e6a8a7bd9ec910c4f5d1649eee7a25bbf1696afc38327"} Oct 03 10:01:49 crc kubenswrapper[4990]: I1003 10:01:49.547879 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.547852223 podStartE2EDuration="5.547852223s" podCreationTimestamp="2025-10-03 10:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:49.535126551 +0000 UTC m=+1091.331758408" watchObservedRunningTime="2025-10-03 10:01:49.547852223 +0000 UTC m=+1091.344484080" 
Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.544890 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerStarted","Data":"c1d6922c6c10979b7ee2a734833d33ed7130b48baa2a6194a8e4354ae639d6ac"} Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.545134 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-log" containerID="cri-o://044ebf070f949da6d98e6a8a7bd9ec910c4f5d1649eee7a25bbf1696afc38327" gracePeriod=30 Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.545353 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-httpd" containerID="cri-o://c1d6922c6c10979b7ee2a734833d33ed7130b48baa2a6194a8e4354ae639d6ac" gracePeriod=30 Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.564864 4990 generic.go:334] "Generic (PLEG): container finished" podID="4a09106e-59ee-4666-940e-7ffeaab8f83f" containerID="ef11333bf94355cba01475b1c22a8097568481d7e330f677fc3d3c2369b3bdc4" exitCode=0 Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.564968 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d264n" event={"ID":"4a09106e-59ee-4666-940e-7ffeaab8f83f","Type":"ContainerDied","Data":"ef11333bf94355cba01475b1c22a8097568481d7e330f677fc3d3c2369b3bdc4"} Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.601690 4990 generic.go:334] "Generic (PLEG): container finished" podID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerID="96fc99a3813a59cc93a8df36354415b2374ed786f252ed3e0fb697ea29e0a766" exitCode=0 Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.601731 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerID="87e659f07a0c49291eeb295e019d8fd0f6487c92fd04fc684400ce1f5a7364c3" exitCode=143 Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.601758 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerDied","Data":"96fc99a3813a59cc93a8df36354415b2374ed786f252ed3e0fb697ea29e0a766"} Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.601795 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerDied","Data":"87e659f07a0c49291eeb295e019d8fd0f6487c92fd04fc684400ce1f5a7364c3"} Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.609773 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.609746007 podStartE2EDuration="6.609746007s" podCreationTimestamp="2025-10-03 10:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:50.579014643 +0000 UTC m=+1092.375646510" watchObservedRunningTime="2025-10-03 10:01:50.609746007 +0000 UTC m=+1092.406377864" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.655966 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6wbg7"] Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656330 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ca1241-a022-4b2a-988f-79e2fd42e6aa" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656348 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ca1241-a022-4b2a-988f-79e2fd42e6aa" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656366 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="init" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656373 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="init" Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656396 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3609b86b-bac8-4a2b-a96c-9e8a317deecf" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656403 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3609b86b-bac8-4a2b-a96c-9e8a317deecf" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656412 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="dnsmasq-dns" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656418 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="dnsmasq-dns" Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656429 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45912f0-ed72-4751-9543-bacd041baba0" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656435 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45912f0-ed72-4751-9543-bacd041baba0" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: E1003 10:01:50.656448 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf03a9d-f212-4cf4-b60f-7c3086446697" containerName="init" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656454 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf03a9d-f212-4cf4-b60f-7c3086446697" containerName="init" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656643 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9cf03a9d-f212-4cf4-b60f-7c3086446697" containerName="init" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656663 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45912f0-ed72-4751-9543-bacd041baba0" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656675 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ca1241-a022-4b2a-988f-79e2fd42e6aa" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656683 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8073a358-c08a-4bc3-b147-77e1a60dd54c" containerName="dnsmasq-dns" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.656696 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3609b86b-bac8-4a2b-a96c-9e8a317deecf" containerName="mariadb-account-create" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.657262 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.659999 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.660319 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ckzgf" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.660558 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.669923 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6wbg7"] Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747476 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747660 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747684 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747715 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747750 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ftlt\" (UniqueName: \"kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.747774 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849548 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ftlt\" (UniqueName: \"kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849598 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849662 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849736 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849753 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849782 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.849851 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.862361 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: 
\"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.877827 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.883035 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.889433 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.906722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ftlt\" (UniqueName: \"kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt\") pod \"cinder-db-sync-6wbg7\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.978046 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vgv57"] Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.979620 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.982171 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.987579 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5xbp" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.987909 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.988264 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 10:01:50 crc kubenswrapper[4990]: I1003 10:01:50.999816 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vgv57"] Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.052894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.052999 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq88\" (UniqueName: \"kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.053026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.109162 4990 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/barbican-db-sync-jwpr6"] Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.110329 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.120778 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ksd4n" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.121052 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.128922 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jwpr6"] Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.154639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.154744 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq88\" (UniqueName: \"kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.154770 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.162244 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.182196 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq88\" (UniqueName: \"kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.182851 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config\") pod \"neutron-db-sync-vgv57\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.256962 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.257007 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.257131 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5z58\" (UniqueName: 
\"kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.358731 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5z58\" (UniqueName: \"kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.358817 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.358850 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.363072 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.363256 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vgv57" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.363373 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.380487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5z58\" (UniqueName: \"kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58\") pod \"barbican-db-sync-jwpr6\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.437199 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.615980 4990 generic.go:334] "Generic (PLEG): container finished" podID="e83fd229-408a-4e17-8af6-36374c507013" containerID="c1d6922c6c10979b7ee2a734833d33ed7130b48baa2a6194a8e4354ae639d6ac" exitCode=0 Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.616004 4990 generic.go:334] "Generic (PLEG): container finished" podID="e83fd229-408a-4e17-8af6-36374c507013" containerID="044ebf070f949da6d98e6a8a7bd9ec910c4f5d1649eee7a25bbf1696afc38327" exitCode=143 Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.616089 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerDied","Data":"c1d6922c6c10979b7ee2a734833d33ed7130b48baa2a6194a8e4354ae639d6ac"} Oct 03 10:01:51 crc kubenswrapper[4990]: I1003 10:01:51.616153 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerDied","Data":"044ebf070f949da6d98e6a8a7bd9ec910c4f5d1649eee7a25bbf1696afc38327"} Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.829439 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.948875 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.948948 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.949006 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.949111 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5xc\" (UniqueName: \"kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.949168 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.949194 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts\") pod \"4a09106e-59ee-4666-940e-7ffeaab8f83f\" (UID: \"4a09106e-59ee-4666-940e-7ffeaab8f83f\") " Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.954830 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts" (OuterVolumeSpecName: "scripts") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.954894 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.968780 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.977213 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc" (OuterVolumeSpecName: "kube-api-access-cf5xc") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "kube-api-access-cf5xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:54 crc kubenswrapper[4990]: I1003 10:01:54.996488 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data" (OuterVolumeSpecName: "config-data") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.015958 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a09106e-59ee-4666-940e-7ffeaab8f83f" (UID: "4a09106e-59ee-4666-940e-7ffeaab8f83f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051628 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5xc\" (UniqueName: \"kubernetes.io/projected/4a09106e-59ee-4666-940e-7ffeaab8f83f-kube-api-access-cf5xc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051654 4990 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051664 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051672 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051681 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.051689 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a09106e-59ee-4666-940e-7ffeaab8f83f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.087813 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.219303 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254606 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254670 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254718 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254737 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254790 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254814 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.254838 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzdxw\" (UniqueName: \"kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw\") pod \"c75ce736-6adb-4999-9db0-92afd4c874a2\" (UID: \"c75ce736-6adb-4999-9db0-92afd4c874a2\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.259159 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts" (OuterVolumeSpecName: "scripts") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.259163 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw" (OuterVolumeSpecName: "kube-api-access-lzdxw") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "kube-api-access-lzdxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.259428 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.259460 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs" (OuterVolumeSpecName: "logs") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.261970 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.286617 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.304194 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.304341 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.307333 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data" (OuterVolumeSpecName: "config-data") pod "c75ce736-6adb-4999-9db0-92afd4c874a2" (UID: "c75ce736-6adb-4999-9db0-92afd4c874a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356470 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356585 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lxbk\" (UniqueName: \"kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356609 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356635 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356683 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356829 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356855 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle\") pod \"e83fd229-408a-4e17-8af6-36374c507013\" (UID: \"e83fd229-408a-4e17-8af6-36374c507013\") " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.356969 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357258 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357281 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357294 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357308 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzdxw\" (UniqueName: \"kubernetes.io/projected/c75ce736-6adb-4999-9db0-92afd4c874a2-kube-api-access-lzdxw\") on node \"crc\" 
DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357332 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357346 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c75ce736-6adb-4999-9db0-92afd4c874a2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357358 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.357368 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c75ce736-6adb-4999-9db0-92afd4c874a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.360782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs" (OuterVolumeSpecName: "logs") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.361377 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk" (OuterVolumeSpecName: "kube-api-access-8lxbk") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "kube-api-access-8lxbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.364074 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.364390 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts" (OuterVolumeSpecName: "scripts") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.380935 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jwpr6"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.382614 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.386915 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vgv57"] Oct 03 10:01:55 crc kubenswrapper[4990]: W1003 10:01:55.390821 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda91f0bdc_c2dc_42e1_9a2e_c2b7b45c1746.slice/crio-221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56 WatchSource:0}: Error finding container 221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56: Status 404 returned error can't find the container with id 221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56 Oct 03 10:01:55 crc 
kubenswrapper[4990]: I1003 10:01:55.396152 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.413916 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data" (OuterVolumeSpecName: "config-data") pod "e83fd229-408a-4e17-8af6-36374c507013" (UID: "e83fd229-408a-4e17-8af6-36374c507013"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459273 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459316 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459334 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459347 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lxbk\" (UniqueName: \"kubernetes.io/projected/e83fd229-408a-4e17-8af6-36374c507013-kube-api-access-8lxbk\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459389 4990 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459404 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e83fd229-408a-4e17-8af6-36374c507013-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.459416 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e83fd229-408a-4e17-8af6-36374c507013-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.489758 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.525956 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6wbg7"] Oct 03 10:01:55 crc kubenswrapper[4990]: W1003 10:01:55.534326 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod294e0d0f_2fbd_42f1_90ff_af3c4188f2f2.slice/crio-bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8 WatchSource:0}: Error finding container bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8: Status 404 returned error can't find the container with id bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8 Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.560476 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.577684 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.661575 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.662958 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="dnsmasq-dns" containerID="cri-o://3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3" gracePeriod=10 Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.700093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c75ce736-6adb-4999-9db0-92afd4c874a2","Type":"ContainerDied","Data":"12dfb8d9e8944b1ce6bdc7d07fea2fcf98c57dcde6b43da896efddc46e0c3dc0"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.700183 4990 scope.go:117] "RemoveContainer" containerID="96fc99a3813a59cc93a8df36354415b2374ed786f252ed3e0fb697ea29e0a766" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.700364 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.711766 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.711779 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e83fd229-408a-4e17-8af6-36374c507013","Type":"ContainerDied","Data":"9f22c6ea783bc85f9e334e3b180eaed55240801e6e4f7222612a3d2f4031d05f"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.726800 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerStarted","Data":"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.727990 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgv57" event={"ID":"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746","Type":"ContainerStarted","Data":"18376407c1f5e9a0a8cae570c29ff378a32d84fb0a1eae862251a332518d5ed9"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.728015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgv57" event={"ID":"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746","Type":"ContainerStarted","Data":"221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.733630 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d264n" event={"ID":"4a09106e-59ee-4666-940e-7ffeaab8f83f","Type":"ContainerDied","Data":"3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.733671 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bbddb07063ac561ffe6d788854a762f84ad40467c8428c04910573735137069" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.733733 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d264n" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.738384 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jwpr6" event={"ID":"32563036-2ae2-4a96-8e50-94100964fd6d","Type":"ContainerStarted","Data":"bd4b572f884000b6ac3ad785f4be8a7c6fe1e2650cff336a90a0acd515b936e6"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.744709 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ckmxp" event={"ID":"5c8a19e1-8b84-48a7-8dde-f22078695aa9","Type":"ContainerStarted","Data":"f18ee0e55f5d16bc0f91cca6c1eec4df5e8474256ea5b13b1d63ebb6a812ece3"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.746084 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6wbg7" event={"ID":"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2","Type":"ContainerStarted","Data":"bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8"} Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.748987 4990 scope.go:117] "RemoveContainer" containerID="87e659f07a0c49291eeb295e019d8fd0f6487c92fd04fc684400ce1f5a7364c3" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.750731 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vgv57" podStartSLOduration=5.750718582 podStartE2EDuration="5.750718582s" podCreationTimestamp="2025-10-03 10:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:55.744754566 +0000 UTC m=+1097.541386423" watchObservedRunningTime="2025-10-03 10:01:55.750718582 +0000 UTC m=+1097.547350439" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.808041 4990 scope.go:117] "RemoveContainer" containerID="c1d6922c6c10979b7ee2a734833d33ed7130b48baa2a6194a8e4354ae639d6ac" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.816893 4990 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.851708 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.871180 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: E1003 10:01:55.872172 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872187 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: E1003 10:01:55.872220 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872228 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: E1003 10:01:55.872244 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872251 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: E1003 10:01:55.872265 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a09106e-59ee-4666-940e-7ffeaab8f83f" containerName="keystone-bootstrap" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872272 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a09106e-59ee-4666-940e-7ffeaab8f83f" containerName="keystone-bootstrap" Oct 03 10:01:55 crc 
kubenswrapper[4990]: E1003 10:01:55.872281 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872288 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872459 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872475 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a09106e-59ee-4666-940e-7ffeaab8f83f" containerName="keystone-bootstrap" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872485 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872497 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-httpd" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.872526 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83fd229-408a-4e17-8af6-36374c507013" containerName="glance-log" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.873659 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.881969 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.882160 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.882956 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.883225 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bw7f6" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.884321 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ckmxp" podStartSLOduration=2.418589941 podStartE2EDuration="10.884307889s" podCreationTimestamp="2025-10-03 10:01:45 +0000 UTC" firstStartedPulling="2025-10-03 10:01:46.324845576 +0000 UTC m=+1088.121477433" lastFinishedPulling="2025-10-03 10:01:54.790563524 +0000 UTC m=+1096.587195381" observedRunningTime="2025-10-03 10:01:55.785983087 +0000 UTC m=+1097.582614964" watchObservedRunningTime="2025-10-03 10:01:55.884307889 +0000 UTC m=+1097.680939746" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.892232 4990 scope.go:117] "RemoveContainer" containerID="044ebf070f949da6d98e6a8a7bd9ec910c4f5d1649eee7a25bbf1696afc38327" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.896255 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.907656 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.923149 4990 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.931228 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.932547 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.936009 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.936185 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.943752 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972376 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972459 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972639 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972670 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972787 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972812 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwks9\" (UniqueName: \"kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.972889 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.985766 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d264n"] Oct 03 10:01:55 crc kubenswrapper[4990]: I1003 10:01:55.994891 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d264n"] Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078577 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078648 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078683 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078748 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078788 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078821 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078849 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078903 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078928 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.078980 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.079007 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwks9\" (UniqueName: \"kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.079040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.079067 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.079095 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9pkx\" (UniqueName: \"kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.079861 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.080072 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.080662 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.080715 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " 
pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.083085 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bnq5z"] Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.086592 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.089446 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.093250 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.093774 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.106345 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.117373 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bnq5z"] Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.123911 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.124168 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.124727 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.126053 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk8mw" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.131690 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwks9\" (UniqueName: \"kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.146608 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185315 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185342 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185377 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185433 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2nzx\" (UniqueName: \"kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: 
I1003 10:01:56.185489 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9pkx\" (UniqueName: \"kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185550 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185588 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185617 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185639 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: 
I1003 10:01:56.185665 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185703 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.185732 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.192418 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.194998 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.196848 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.196954 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.200973 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.201436 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.203282 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.208973 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9pkx\" (UniqueName: \"kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.213755 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.254478 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.263096 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.287171 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.287335 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2nzx\" (UniqueName: \"kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.287401 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.287442 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.287471 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc 
kubenswrapper[4990]: I1003 10:01:56.287531 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.298857 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.299311 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.300158 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.300689 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.300801 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 
03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.301342 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.317090 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2nzx\" (UniqueName: \"kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx\") pod \"keystone-bootstrap-bnq5z\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.389030 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb\") pod \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.389154 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc\") pod \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.389219 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbsc\" (UniqueName: \"kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc\") pod \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.389277 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb\") pod \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.389347 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config\") pod \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\" (UID: \"6630a347-e78f-4354-a4d2-2ba01ce1ab0c\") " Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.398653 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc" (OuterVolumeSpecName: "kube-api-access-tsbsc") pod "6630a347-e78f-4354-a4d2-2ba01ce1ab0c" (UID: "6630a347-e78f-4354-a4d2-2ba01ce1ab0c"). InnerVolumeSpecName "kube-api-access-tsbsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.460350 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6630a347-e78f-4354-a4d2-2ba01ce1ab0c" (UID: "6630a347-e78f-4354-a4d2-2ba01ce1ab0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.465949 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6630a347-e78f-4354-a4d2-2ba01ce1ab0c" (UID: "6630a347-e78f-4354-a4d2-2ba01ce1ab0c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.467041 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config" (OuterVolumeSpecName: "config") pod "6630a347-e78f-4354-a4d2-2ba01ce1ab0c" (UID: "6630a347-e78f-4354-a4d2-2ba01ce1ab0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.491034 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.491297 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.491312 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.491326 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbsc\" (UniqueName: \"kubernetes.io/projected/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-kube-api-access-tsbsc\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.518544 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6630a347-e78f-4354-a4d2-2ba01ce1ab0c" (UID: "6630a347-e78f-4354-a4d2-2ba01ce1ab0c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.536121 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.593640 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6630a347-e78f-4354-a4d2-2ba01ce1ab0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.759574 4990 generic.go:334] "Generic (PLEG): container finished" podID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerID="3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3" exitCode=0 Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.759744 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.760494 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" event={"ID":"6630a347-e78f-4354-a4d2-2ba01ce1ab0c","Type":"ContainerDied","Data":"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3"} Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.760568 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-4t8d6" event={"ID":"6630a347-e78f-4354-a4d2-2ba01ce1ab0c","Type":"ContainerDied","Data":"e3e486c0b0f3632ca41794405d34dc7a483f79f5416821d9250871dab4f65109"} Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.760604 4990 scope.go:117] "RemoveContainer" containerID="3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.816185 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.825026 4990 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-4t8d6"] Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.901088 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a09106e-59ee-4666-940e-7ffeaab8f83f" path="/var/lib/kubelet/pods/4a09106e-59ee-4666-940e-7ffeaab8f83f/volumes" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.901775 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" path="/var/lib/kubelet/pods/6630a347-e78f-4354-a4d2-2ba01ce1ab0c/volumes" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.902706 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75ce736-6adb-4999-9db0-92afd4c874a2" path="/var/lib/kubelet/pods/c75ce736-6adb-4999-9db0-92afd4c874a2/volumes" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.904170 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83fd229-408a-4e17-8af6-36374c507013" path="/var/lib/kubelet/pods/e83fd229-408a-4e17-8af6-36374c507013/volumes" Oct 03 10:01:56 crc kubenswrapper[4990]: I1003 10:01:56.929109 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.086445 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.181330 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bnq5z"] Oct 03 10:01:57 crc kubenswrapper[4990]: W1003 10:01:57.457117 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d40201_da82_4b6e_b9b1_acaab48c2885.slice/crio-e2ad86cae88692b3f40c6dc581acaebfeb64760fdba457409aa94676a1ae4d66 WatchSource:0}: Error finding container 
e2ad86cae88692b3f40c6dc581acaebfeb64760fdba457409aa94676a1ae4d66: Status 404 returned error can't find the container with id e2ad86cae88692b3f40c6dc581acaebfeb64760fdba457409aa94676a1ae4d66 Oct 03 10:01:57 crc kubenswrapper[4990]: W1003 10:01:57.471197 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007a3204_42d2_4769_b267_c80963e1810e.slice/crio-7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873 WatchSource:0}: Error finding container 7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873: Status 404 returned error can't find the container with id 7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873 Oct 03 10:01:57 crc kubenswrapper[4990]: W1003 10:01:57.483991 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b7ab18_051d_4512_a87f_962750e82da6.slice/crio-a7d2ad1e3bcf8443837ef75df589d4bd3b46804abe751448f8c7636cfab87696 WatchSource:0}: Error finding container a7d2ad1e3bcf8443837ef75df589d4bd3b46804abe751448f8c7636cfab87696: Status 404 returned error can't find the container with id a7d2ad1e3bcf8443837ef75df589d4bd3b46804abe751448f8c7636cfab87696 Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.504261 4990 scope.go:117] "RemoveContainer" containerID="6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe" Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.686472 4990 scope.go:117] "RemoveContainer" containerID="3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3" Oct 03 10:01:57 crc kubenswrapper[4990]: E1003 10:01:57.689657 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3\": container with ID starting with 3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3 not found: ID 
does not exist" containerID="3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3" Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.689692 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3"} err="failed to get container status \"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3\": rpc error: code = NotFound desc = could not find container \"3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3\": container with ID starting with 3a5737bef23a9408bdf8e55af6676df4cdd047bea072c3ab5d098c0dab1976a3 not found: ID does not exist" Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.689716 4990 scope.go:117] "RemoveContainer" containerID="6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe" Oct 03 10:01:57 crc kubenswrapper[4990]: E1003 10:01:57.690236 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe\": container with ID starting with 6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe not found: ID does not exist" containerID="6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe" Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.690287 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe"} err="failed to get container status \"6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe\": rpc error: code = NotFound desc = could not find container \"6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe\": container with ID starting with 6b8622d1b32147df839a99e9f4446b4e762077923e8eb468ed235484c498bcfe not found: ID does not exist" Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.792159 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnq5z" event={"ID":"007a3204-42d2-4769-b267-c80963e1810e","Type":"ContainerStarted","Data":"7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873"} Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.800776 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerStarted","Data":"e2ad86cae88692b3f40c6dc581acaebfeb64760fdba457409aa94676a1ae4d66"} Oct 03 10:01:57 crc kubenswrapper[4990]: I1003 10:01:57.802286 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerStarted","Data":"a7d2ad1e3bcf8443837ef75df589d4bd3b46804abe751448f8c7636cfab87696"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.824854 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnq5z" event={"ID":"007a3204-42d2-4769-b267-c80963e1810e","Type":"ContainerStarted","Data":"736cd5636cb503db1231834fc566fac0514837d857cdaa40203947278154656f"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.829547 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerStarted","Data":"e278b7414732cf851499cb8238bb151fdad6ab1335eedadf17655c20cf21385c"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.834788 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerStarted","Data":"bf7f01839d28dad19c4310b3009994bb5a5bcbc28458035b4b139da18b099d83"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.836996 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerStarted","Data":"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.840635 4990 generic.go:334] "Generic (PLEG): container finished" podID="5c8a19e1-8b84-48a7-8dde-f22078695aa9" containerID="f18ee0e55f5d16bc0f91cca6c1eec4df5e8474256ea5b13b1d63ebb6a812ece3" exitCode=0 Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.840744 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ckmxp" event={"ID":"5c8a19e1-8b84-48a7-8dde-f22078695aa9","Type":"ContainerDied","Data":"f18ee0e55f5d16bc0f91cca6c1eec4df5e8474256ea5b13b1d63ebb6a812ece3"} Oct 03 10:01:58 crc kubenswrapper[4990]: I1003 10:01:58.853533 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bnq5z" podStartSLOduration=2.853493658 podStartE2EDuration="2.853493658s" podCreationTimestamp="2025-10-03 10:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:58.843251795 +0000 UTC m=+1100.639883662" watchObservedRunningTime="2025-10-03 10:01:58.853493658 +0000 UTC m=+1100.650125515" Oct 03 10:01:59 crc kubenswrapper[4990]: I1003 10:01:59.857370 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerStarted","Data":"cd36141f435da361530acef5da9f4fb0c32f1ae7b9aa07b1f26b3c189bca420a"} Oct 03 10:01:59 crc kubenswrapper[4990]: I1003 10:01:59.861317 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerStarted","Data":"aa9a6c01fafc1b1871bc187c046c06c9b62a3ca43302bbdf8d59afe00dc56ff3"} Oct 03 10:01:59 crc kubenswrapper[4990]: I1003 10:01:59.880577 4990 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.8805541439999995 podStartE2EDuration="4.880554144s" podCreationTimestamp="2025-10-03 10:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:59.875474523 +0000 UTC m=+1101.672106380" watchObservedRunningTime="2025-10-03 10:01:59.880554144 +0000 UTC m=+1101.677186011" Oct 03 10:01:59 crc kubenswrapper[4990]: I1003 10:01:59.916399 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.916373946 podStartE2EDuration="4.916373946s" podCreationTimestamp="2025-10-03 10:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:59.909255323 +0000 UTC m=+1101.705887170" watchObservedRunningTime="2025-10-03 10:01:59.916373946 +0000 UTC m=+1101.713005803" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.164357 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ckmxp" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224103 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cnl2\" (UniqueName: \"kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2\") pod \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224241 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs\") pod \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224310 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data\") pod \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224339 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle\") pod \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224384 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts\") pod \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\" (UID: \"5c8a19e1-8b84-48a7-8dde-f22078695aa9\") " Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.224769 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs" (OuterVolumeSpecName: "logs") pod "5c8a19e1-8b84-48a7-8dde-f22078695aa9" (UID: "5c8a19e1-8b84-48a7-8dde-f22078695aa9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.230118 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2" (OuterVolumeSpecName: "kube-api-access-5cnl2") pod "5c8a19e1-8b84-48a7-8dde-f22078695aa9" (UID: "5c8a19e1-8b84-48a7-8dde-f22078695aa9"). InnerVolumeSpecName "kube-api-access-5cnl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.231753 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts" (OuterVolumeSpecName: "scripts") pod "5c8a19e1-8b84-48a7-8dde-f22078695aa9" (UID: "5c8a19e1-8b84-48a7-8dde-f22078695aa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.260713 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c8a19e1-8b84-48a7-8dde-f22078695aa9" (UID: "5c8a19e1-8b84-48a7-8dde-f22078695aa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.271616 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data" (OuterVolumeSpecName: "config-data") pod "5c8a19e1-8b84-48a7-8dde-f22078695aa9" (UID: "5c8a19e1-8b84-48a7-8dde-f22078695aa9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.327098 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c8a19e1-8b84-48a7-8dde-f22078695aa9-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.327139 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.327154 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.327167 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8a19e1-8b84-48a7-8dde-f22078695aa9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.327180 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cnl2\" (UniqueName: \"kubernetes.io/projected/5c8a19e1-8b84-48a7-8dde-f22078695aa9-kube-api-access-5cnl2\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.907914 4990 generic.go:334] "Generic (PLEG): container finished" podID="007a3204-42d2-4769-b267-c80963e1810e" containerID="736cd5636cb503db1231834fc566fac0514837d857cdaa40203947278154656f" exitCode=0 Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.908002 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnq5z" event={"ID":"007a3204-42d2-4769-b267-c80963e1810e","Type":"ContainerDied","Data":"736cd5636cb503db1231834fc566fac0514837d857cdaa40203947278154656f"} Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.910399 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ckmxp" event={"ID":"5c8a19e1-8b84-48a7-8dde-f22078695aa9","Type":"ContainerDied","Data":"ba8f3c06dee832a2e6c10bf03950f9ba4f5e29e9dfc5e4cecfa817aa8289e609"} Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.910437 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba8f3c06dee832a2e6c10bf03950f9ba4f5e29e9dfc5e4cecfa817aa8289e609" Oct 03 10:02:03 crc kubenswrapper[4990]: I1003 10:02:03.910474 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ckmxp" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.270476 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59f897c554-pws5p"] Oct 03 10:02:04 crc kubenswrapper[4990]: E1003 10:02:04.270902 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8a19e1-8b84-48a7-8dde-f22078695aa9" containerName="placement-db-sync" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.270920 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8a19e1-8b84-48a7-8dde-f22078695aa9" containerName="placement-db-sync" Oct 03 10:02:04 crc kubenswrapper[4990]: E1003 10:02:04.270934 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="dnsmasq-dns" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.270940 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="dnsmasq-dns" Oct 03 10:02:04 crc kubenswrapper[4990]: E1003 10:02:04.270965 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="init" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.270972 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="init" Oct 03 10:02:04 crc 
kubenswrapper[4990]: I1003 10:02:04.271141 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6630a347-e78f-4354-a4d2-2ba01ce1ab0c" containerName="dnsmasq-dns" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.271157 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8a19e1-8b84-48a7-8dde-f22078695aa9" containerName="placement-db-sync" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.272160 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.278102 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.278327 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.278583 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.278797 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.279899 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-v92hc" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.293689 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f897c554-pws5p"] Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.344765 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc 
kubenswrapper[4990]: I1003 10:02:04.344829 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.344891 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.344943 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.344978 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qc64\" (UniqueName: \"kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.345004 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " 
pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.345026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446625 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446724 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446771 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 
10:02:04.446849 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446876 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qc64\" (UniqueName: \"kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.446903 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.447444 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.454115 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.454500 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.454792 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.462594 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.463211 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.466494 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qc64\" (UniqueName: \"kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64\") pod \"placement-59f897c554-pws5p\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:04 crc kubenswrapper[4990]: I1003 10:02:04.601648 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.204039 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.204355 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.242881 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.250476 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.264749 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.264794 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.298298 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.327701 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.943492 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.943553 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.943565 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:06 crc kubenswrapper[4990]: I1003 10:02:06.943573 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 10:02:08 crc kubenswrapper[4990]: I1003 10:02:08.988651 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 10:02:08 crc kubenswrapper[4990]: I1003 10:02:08.988912 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 10:02:08 crc kubenswrapper[4990]: I1003 10:02:08.990314 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 10:02:09 crc kubenswrapper[4990]: I1003 10:02:09.193399 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:09 crc kubenswrapper[4990]: I1003 10:02:09.193527 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 10:02:09 crc kubenswrapper[4990]: I1003 10:02:09.197099 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 10:02:13 crc kubenswrapper[4990]: I1003 10:02:13.966577 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:02:14 crc kubenswrapper[4990]: E1003 10:02:14.039370 4990 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166" Oct 03 10:02:14 crc kubenswrapper[4990]: E1003 10:02:14.039599 4990 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMoun
t{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ftlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6wbg7_openstack(294e0d0f-2fbd-42f1-90ff-af3c4188f2f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 10:02:14 crc kubenswrapper[4990]: E1003 10:02:14.041335 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6wbg7" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.049411 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bnq5z" event={"ID":"007a3204-42d2-4769-b267-c80963e1810e","Type":"ContainerDied","Data":"7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873"} Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.049449 4990 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7923eb7947fbbdcba6ee6881479b10dcbaa2c199e632d5f619446c8701f4d873" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.049502 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bnq5z" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.085969 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.086025 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.086080 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.086117 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.086181 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: 
\"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.086315 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2nzx\" (UniqueName: \"kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx\") pod \"007a3204-42d2-4769-b267-c80963e1810e\" (UID: \"007a3204-42d2-4769-b267-c80963e1810e\") " Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.093559 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts" (OuterVolumeSpecName: "scripts") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.093818 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx" (OuterVolumeSpecName: "kube-api-access-f2nzx") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "kube-api-access-f2nzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.098554 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.099646 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.115702 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data" (OuterVolumeSpecName: "config-data") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.117385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "007a3204-42d2-4769-b267-c80963e1810e" (UID: "007a3204-42d2-4769-b267-c80963e1810e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188260 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2nzx\" (UniqueName: \"kubernetes.io/projected/007a3204-42d2-4769-b267-c80963e1810e-kube-api-access-f2nzx\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188299 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188308 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188318 4990 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188326 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:14 crc kubenswrapper[4990]: I1003 10:02:14.188335 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/007a3204-42d2-4769-b267-c80963e1810e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:15 crc kubenswrapper[4990]: E1003 10:02:15.058866 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166\\\"\"" pod="openstack/cinder-db-sync-6wbg7" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.155478 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:02:15 crc kubenswrapper[4990]: E1003 10:02:15.155913 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007a3204-42d2-4769-b267-c80963e1810e" containerName="keystone-bootstrap" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.155933 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="007a3204-42d2-4769-b267-c80963e1810e" containerName="keystone-bootstrap" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.156108 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="007a3204-42d2-4769-b267-c80963e1810e" containerName="keystone-bootstrap" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.156730 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164134 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kk8mw" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164287 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164401 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164542 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164670 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.164871 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.169049 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.305327 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.305464 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " 
pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.305734 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.305876 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpg2j\" (UniqueName: \"kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.306026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.306077 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.306128 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " 
pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.306150 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.407204 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.407602 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.407722 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.407862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpg2j\" (UniqueName: \"kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 
10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.408004 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.408080 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.408155 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.408222 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.414471 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.414561 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.414703 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.418371 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.418752 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.424553 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpg2j\" (UniqueName: \"kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.424739 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle\") pod 
\"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.428909 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys\") pod \"keystone-6897c54f48-kp6tm\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:15 crc kubenswrapper[4990]: I1003 10:02:15.511627 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:17 crc kubenswrapper[4990]: I1003 10:02:17.434675 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59f897c554-pws5p"] Oct 03 10:02:17 crc kubenswrapper[4990]: W1003 10:02:17.442564 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f14bb4e_f980_48fb_bba4_c068419b1975.slice/crio-9a8c1308c67cc4e4f4b4a8993181d9f75055aafa0aa3a8119196155c514815ba WatchSource:0}: Error finding container 9a8c1308c67cc4e4f4b4a8993181d9f75055aafa0aa3a8119196155c514815ba: Status 404 returned error can't find the container with id 9a8c1308c67cc4e4f4b4a8993181d9f75055aafa0aa3a8119196155c514815ba Oct 03 10:02:17 crc kubenswrapper[4990]: I1003 10:02:17.695208 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.083253 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6897c54f48-kp6tm" event={"ID":"b2d5088c-5854-4bee-9e3c-8198d4b7d377","Type":"ContainerStarted","Data":"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.083338 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-6897c54f48-kp6tm" event={"ID":"b2d5088c-5854-4bee-9e3c-8198d4b7d377","Type":"ContainerStarted","Data":"b6b6e7d4741443d6712fd60c34ea6865f02f11ff44c5f45d9705dcc570ca396f"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.083362 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.085371 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerStarted","Data":"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.086802 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jwpr6" event={"ID":"32563036-2ae2-4a96-8e50-94100964fd6d","Type":"ContainerStarted","Data":"235ff221ea4844e3ddd9a837dd1b1ff70199e09ac1b1d727c1c8b458b0e2c2b9"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.088616 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerStarted","Data":"eca91cab1b64d609b0395ae540c0b555039475f0cf2f921543fe267f50e40ea2"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.088680 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerStarted","Data":"71ede97e001d771329948c2371baea1801f48fce90e28d8bf76d142a1956d199"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.088706 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerStarted","Data":"9a8c1308c67cc4e4f4b4a8993181d9f75055aafa0aa3a8119196155c514815ba"} Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.088819 4990 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.088850 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.109034 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6897c54f48-kp6tm" podStartSLOduration=3.109016347 podStartE2EDuration="3.109016347s" podCreationTimestamp="2025-10-03 10:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:18.104222186 +0000 UTC m=+1119.900854043" watchObservedRunningTime="2025-10-03 10:02:18.109016347 +0000 UTC m=+1119.905648204" Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.133881 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jwpr6" podStartSLOduration=5.360699883 podStartE2EDuration="27.133861802s" podCreationTimestamp="2025-10-03 10:01:51 +0000 UTC" firstStartedPulling="2025-10-03 10:01:55.392723702 +0000 UTC m=+1097.189355559" lastFinishedPulling="2025-10-03 10:02:17.165885621 +0000 UTC m=+1118.962517478" observedRunningTime="2025-10-03 10:02:18.127140223 +0000 UTC m=+1119.923772090" watchObservedRunningTime="2025-10-03 10:02:18.133861802 +0000 UTC m=+1119.930493679" Oct 03 10:02:18 crc kubenswrapper[4990]: I1003 10:02:18.146373 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59f897c554-pws5p" podStartSLOduration=14.146354266 podStartE2EDuration="14.146354266s" podCreationTimestamp="2025-10-03 10:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:18.142635582 +0000 UTC m=+1119.939267459" watchObservedRunningTime="2025-10-03 10:02:18.146354266 +0000 
UTC m=+1119.942986123" Oct 03 10:02:21 crc kubenswrapper[4990]: I1003 10:02:21.117670 4990 generic.go:334] "Generic (PLEG): container finished" podID="32563036-2ae2-4a96-8e50-94100964fd6d" containerID="235ff221ea4844e3ddd9a837dd1b1ff70199e09ac1b1d727c1c8b458b0e2c2b9" exitCode=0 Oct 03 10:02:21 crc kubenswrapper[4990]: I1003 10:02:21.118006 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jwpr6" event={"ID":"32563036-2ae2-4a96-8e50-94100964fd6d","Type":"ContainerDied","Data":"235ff221ea4844e3ddd9a837dd1b1ff70199e09ac1b1d727c1c8b458b0e2c2b9"} Oct 03 10:02:25 crc kubenswrapper[4990]: I1003 10:02:25.304354 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:02:25 crc kubenswrapper[4990]: I1003 10:02:25.305040 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:02:25 crc kubenswrapper[4990]: I1003 10:02:25.305103 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:02:25 crc kubenswrapper[4990]: I1003 10:02:25.305937 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:02:25 crc 
kubenswrapper[4990]: I1003 10:02:25.306038 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8" gracePeriod=600 Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.170645 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8" exitCode=0 Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.170695 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8"} Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.170763 4990 scope.go:117] "RemoveContainer" containerID="f9518ccde27cb06822f3f654faf41c25e94fa95c0d239e9864a7e2a64f0cb2e1" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.317951 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.429913 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data\") pod \"32563036-2ae2-4a96-8e50-94100964fd6d\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.429973 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5z58\" (UniqueName: \"kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58\") pod \"32563036-2ae2-4a96-8e50-94100964fd6d\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.430110 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle\") pod \"32563036-2ae2-4a96-8e50-94100964fd6d\" (UID: \"32563036-2ae2-4a96-8e50-94100964fd6d\") " Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.436021 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58" (OuterVolumeSpecName: "kube-api-access-l5z58") pod "32563036-2ae2-4a96-8e50-94100964fd6d" (UID: "32563036-2ae2-4a96-8e50-94100964fd6d"). InnerVolumeSpecName "kube-api-access-l5z58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.436024 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "32563036-2ae2-4a96-8e50-94100964fd6d" (UID: "32563036-2ae2-4a96-8e50-94100964fd6d"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.460683 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32563036-2ae2-4a96-8e50-94100964fd6d" (UID: "32563036-2ae2-4a96-8e50-94100964fd6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.532658 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.532707 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32563036-2ae2-4a96-8e50-94100964fd6d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:26 crc kubenswrapper[4990]: I1003 10:02:26.532723 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5z58\" (UniqueName: \"kubernetes.io/projected/32563036-2ae2-4a96-8e50-94100964fd6d-kube-api-access-l5z58\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.182631 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7"} Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185062 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerStarted","Data":"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce"} Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185223 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-central-agent" containerID="cri-o://3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e" gracePeriod=30 Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185490 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185564 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="proxy-httpd" containerID="cri-o://12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce" gracePeriod=30 Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185616 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="sg-core" containerID="cri-o://1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768" gracePeriod=30 Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.185669 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-notification-agent" containerID="cri-o://b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751" gracePeriod=30 Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.188840 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jwpr6" 
event={"ID":"32563036-2ae2-4a96-8e50-94100964fd6d","Type":"ContainerDied","Data":"bd4b572f884000b6ac3ad785f4be8a7c6fe1e2650cff336a90a0acd515b936e6"} Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.188878 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4b572f884000b6ac3ad785f4be8a7c6fe1e2650cff336a90a0acd515b936e6" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.188939 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jwpr6" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.241443 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.684184159 podStartE2EDuration="43.241391682s" podCreationTimestamp="2025-10-03 10:01:44 +0000 UTC" firstStartedPulling="2025-10-03 10:01:46.082604375 +0000 UTC m=+1087.879236232" lastFinishedPulling="2025-10-03 10:02:26.639811898 +0000 UTC m=+1128.436443755" observedRunningTime="2025-10-03 10:02:27.23418243 +0000 UTC m=+1129.030814287" watchObservedRunningTime="2025-10-03 10:02:27.241391682 +0000 UTC m=+1129.038023559" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.671590 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:02:27 crc kubenswrapper[4990]: E1003 10:02:27.672449 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32563036-2ae2-4a96-8e50-94100964fd6d" containerName="barbican-db-sync" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.672773 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="32563036-2ae2-4a96-8e50-94100964fd6d" containerName="barbican-db-sync" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.673024 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="32563036-2ae2-4a96-8e50-94100964fd6d" containerName="barbican-db-sync" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.674262 4990 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.680793 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.681032 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.681088 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ksd4n" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.698635 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.700176 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.706096 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.714961 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.739552 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.757335 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 
10:02:27.757384 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.757405 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.757444 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.757476 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88v7\" (UniqueName: \"kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862485 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " 
pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862554 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862581 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862623 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862645 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862678 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x88v7\" (UniqueName: \"kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7\") pod 
\"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862706 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m87l\" (UniqueName: \"kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862787 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.862838 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.869994 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.878472 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.880320 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.881253 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.904669 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.906262 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.935073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88v7\" (UniqueName: \"kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7\") pod \"barbican-worker-64b566fdb9-7b8mq\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.952876 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.964541 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.964599 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.964634 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.964674 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.964715 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m87l\" (UniqueName: \"kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.966572 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.983170 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 10:02:27.987374 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:27 crc kubenswrapper[4990]: I1003 
10:02:27.990190 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.012253 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m87l\" (UniqueName: \"kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l\") pod \"barbican-keystone-listener-8499569686-hgsxg\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.017939 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.058180 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.066378 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.066497 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.066550 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.066606 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlcdc\" (UniqueName: \"kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.067002 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc\") pod 
\"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.067065 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.122544 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"] Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.123972 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.132076 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.147749 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"] Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168184 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlcdc\" (UniqueName: \"kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168247 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " 
pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168277 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168323 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168375 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.168391 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.169063 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc 
kubenswrapper[4990]: I1003 10:02:28.169365 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.169602 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.170024 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.170235 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.221362 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlcdc\" (UniqueName: \"kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc\") pod \"dnsmasq-dns-67cb9f45b5-29xjr\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.247957 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerID="12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce" exitCode=0 Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.247989 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerID="1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768" exitCode=2 Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.247997 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerID="3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e" exitCode=0 Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.248743 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerDied","Data":"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce"} Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.248825 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerDied","Data":"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768"} Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.248842 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerDied","Data":"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e"} Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.259216 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.270344 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.270408 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.270448 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfgmw\" (UniqueName: \"kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.270628 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.270697 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data\") pod 
\"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.376109 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.377485 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.377543 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.377639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfgmw\" (UniqueName: \"kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.377890 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs\") pod \"barbican-api-f5888695d-8qpvg\" (UID: 
\"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.378416 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.396489 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.397248 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.401182 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.403141 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfgmw\" (UniqueName: \"kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw\") pod \"barbican-api-f5888695d-8qpvg\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 
10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.468776 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.812322 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.944021 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:28 crc kubenswrapper[4990]: W1003 10:02:28.949678 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba45b7a6_dfa9_4aa9_ae7b_bcfd0f8f8ffd.slice/crio-343de753ee7b614002cdf85555efdc9ff6cbe82f476091837c1c56fd512d509e WatchSource:0}: Error finding container 343de753ee7b614002cdf85555efdc9ff6cbe82f476091837c1c56fd512d509e: Status 404 returned error can't find the container with id 343de753ee7b614002cdf85555efdc9ff6cbe82f476091837c1c56fd512d509e Oct 03 10:02:28 crc kubenswrapper[4990]: I1003 10:02:28.975911 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.149500 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"] Oct 03 10:02:29 crc kubenswrapper[4990]: W1003 10:02:29.159103 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc68b94_34c1_4fe2_8b30_08b0fc7aaaf9.slice/crio-e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8 WatchSource:0}: Error finding container e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8: Status 404 returned error can't find the container with id e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8 Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.272659 
4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6wbg7" event={"ID":"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2","Type":"ContainerStarted","Data":"035fb51c6a8dbf9baf89703a1824ed1de4a14267cb2db1bae664bb9a91aaa7e9"} Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.274535 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" event={"ID":"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd","Type":"ContainerStarted","Data":"343de753ee7b614002cdf85555efdc9ff6cbe82f476091837c1c56fd512d509e"} Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.275475 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerStarted","Data":"3570d5cd09268ffa0be273b30970ac5d32e3b12e2c307a4fcbde0c24c3d8c6f3"} Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.276553 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerStarted","Data":"e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8"} Oct 03 10:02:29 crc kubenswrapper[4990]: I1003 10:02:29.277564 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerStarted","Data":"48515689fb9bb8d1abc710cdbd3893111b9a1c5d49a0cce9338cc76a6c2b0c8f"} Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.288140 4990 generic.go:334] "Generic (PLEG): container finished" podID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerID="47a0977a531901d7abf10b315df58756b9b29badb77ce48e3bf02401eab404f7" exitCode=0 Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.288322 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" 
event={"ID":"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd","Type":"ContainerDied","Data":"47a0977a531901d7abf10b315df58756b9b29badb77ce48e3bf02401eab404f7"} Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.741581 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6wbg7" podStartSLOduration=8.947609258 podStartE2EDuration="40.741559953s" podCreationTimestamp="2025-10-03 10:01:50 +0000 UTC" firstStartedPulling="2025-10-03 10:01:55.537238696 +0000 UTC m=+1097.333870553" lastFinishedPulling="2025-10-03 10:02:27.331189391 +0000 UTC m=+1129.127821248" observedRunningTime="2025-10-03 10:02:29.302073431 +0000 UTC m=+1131.098705298" watchObservedRunningTime="2025-10-03 10:02:30.741559953 +0000 UTC m=+1132.538191810" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.748038 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.750262 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.752206 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.758452 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.760822 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.832527 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.832642 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.832683 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjj6\" (UniqueName: \"kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.832715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.833057 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.833209 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.833236 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935034 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935118 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935195 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935273 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935308 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjj6\" (UniqueName: \"kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.935338 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.936312 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.943373 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.943716 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.945419 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.947290 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle\") pod 
\"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.949085 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:30 crc kubenswrapper[4990]: I1003 10:02:30.960525 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjj6\" (UniqueName: \"kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6\") pod \"barbican-api-7ff64b77bd-5qpwf\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") " pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:31 crc kubenswrapper[4990]: I1003 10:02:31.069209 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:31 crc kubenswrapper[4990]: I1003 10:02:31.556316 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:02:31 crc kubenswrapper[4990]: W1003 10:02:31.562944 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebb4021b_c9ef_4b31_864d_d4874b51e47c.slice/crio-1b3039dbb00e8d2505870cd5c3f0794059ebcdb82f117d2e9019b0d60938ce5a WatchSource:0}: Error finding container 1b3039dbb00e8d2505870cd5c3f0794059ebcdb82f117d2e9019b0d60938ce5a: Status 404 returned error can't find the container with id 1b3039dbb00e8d2505870cd5c3f0794059ebcdb82f117d2e9019b0d60938ce5a Oct 03 10:02:32 crc kubenswrapper[4990]: I1003 10:02:32.310481 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerStarted","Data":"1b3039dbb00e8d2505870cd5c3f0794059ebcdb82f117d2e9019b0d60938ce5a"} Oct 03 10:02:32 crc kubenswrapper[4990]: I1003 10:02:32.970622 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.083494 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrfph\" (UniqueName: \"kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.083957 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.083990 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.084017 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.084071 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.084173 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.084222 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle\") pod \"3e8d0256-9c01-46fc-92e3-2e4e87709158\" (UID: \"3e8d0256-9c01-46fc-92e3-2e4e87709158\") " Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.087172 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.087574 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.091089 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts" (OuterVolumeSpecName: "scripts") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.095524 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph" (OuterVolumeSpecName: "kube-api-access-zrfph") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "kube-api-access-zrfph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.111388 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.181687 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.186834 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrfph\" (UniqueName: \"kubernetes.io/projected/3e8d0256-9c01-46fc-92e3-2e4e87709158-kube-api-access-zrfph\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.187878 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.187974 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.188068 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e8d0256-9c01-46fc-92e3-2e4e87709158-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.188145 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.188230 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.231738 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data" (OuterVolumeSpecName: "config-data") pod "3e8d0256-9c01-46fc-92e3-2e4e87709158" (UID: "3e8d0256-9c01-46fc-92e3-2e4e87709158"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.290110 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0256-9c01-46fc-92e3-2e4e87709158-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.320498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerStarted","Data":"2cded7f28a6d70eec113a2c1f2cd4403da85f5f7ed00423c09b48119b4dd6575"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.320564 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerStarted","Data":"5709db94fbf37881b7dcb88ca8ea55ff5d3a16e10fcac9dff43ec457d9265d4c"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.320990 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.323091 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" event={"ID":"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd","Type":"ContainerStarted","Data":"e371d55bd10e847369d548faf700471f9d864f16ce92c8a578247b592766ceb4"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.323347 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.325766 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerStarted","Data":"dd31dfd8ea225e9fe468b78acfe34ecc14ae0b5990b5d3fa0f4371f27dad6f4c"} Oct 03 10:02:33 crc 
kubenswrapper[4990]: I1003 10:02:33.325795 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerStarted","Data":"2f9c4a7944eae6a958448c1d084cbafac9f94669c8762fb52bb88d1d7c1f256d"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.326179 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.326677 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.329361 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerID="b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751" exitCode=0 Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.329569 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerDied","Data":"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.329660 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.329893 4990 scope.go:117] "RemoveContainer" containerID="12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.329727 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e8d0256-9c01-46fc-92e3-2e4e87709158","Type":"ContainerDied","Data":"fee84227d4a74583c3cb012766f280ee20445d563e9965efb9a18217f3b966b2"} Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.346649 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f5888695d-8qpvg" podStartSLOduration=5.346629297 podStartE2EDuration="5.346629297s" podCreationTimestamp="2025-10-03 10:02:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:33.344404061 +0000 UTC m=+1135.141035938" watchObservedRunningTime="2025-10-03 10:02:33.346629297 +0000 UTC m=+1135.143261154" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.364297 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" podStartSLOduration=6.364278471 podStartE2EDuration="6.364278471s" podCreationTimestamp="2025-10-03 10:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:33.359681725 +0000 UTC m=+1135.156313592" watchObservedRunningTime="2025-10-03 10:02:33.364278471 +0000 UTC m=+1135.160910328" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.386119 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ff64b77bd-5qpwf" podStartSLOduration=3.38610064 podStartE2EDuration="3.38610064s" podCreationTimestamp="2025-10-03 10:02:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:33.379624737 +0000 UTC m=+1135.176256604" watchObservedRunningTime="2025-10-03 10:02:33.38610064 +0000 UTC m=+1135.182732497" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.403073 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.411338 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.427950 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:02:33 crc kubenswrapper[4990]: E1003 10:02:33.428300 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-notification-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428317 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-notification-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: E1003 10:02:33.428332 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="sg-core" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428338 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="sg-core" Oct 03 10:02:33 crc kubenswrapper[4990]: E1003 10:02:33.428367 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-central-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428374 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-central-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: E1003 10:02:33.428381 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="proxy-httpd" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428387 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="proxy-httpd" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428582 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="proxy-httpd" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428598 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-central-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428612 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="sg-core" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.428624 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" containerName="ceilometer-notification-agent" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.435861 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.438416 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.438813 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.439031 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.499433 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.500659 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hsr7\" (UniqueName: \"kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.500894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.500979 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.501047 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.501188 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.501310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.501448 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.603769 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.603876 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " 
pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.603968 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604050 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hsr7\" (UniqueName: \"kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604085 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604122 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604357 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.604409 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.608024 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.608378 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.608393 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.608722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.620088 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5hsr7\" (UniqueName: \"kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7\") pod \"ceilometer-0\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.637787 4990 scope.go:117] "RemoveContainer" containerID="1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.832281 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:02:33 crc kubenswrapper[4990]: I1003 10:02:33.966139 4990 scope.go:117] "RemoveContainer" containerID="b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.000963 4990 scope.go:117] "RemoveContainer" containerID="3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.146863 4990 scope.go:117] "RemoveContainer" containerID="12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce" Oct 03 10:02:34 crc kubenswrapper[4990]: E1003 10:02:34.147600 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce\": container with ID starting with 12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce not found: ID does not exist" containerID="12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.147636 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce"} err="failed to get container status \"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce\": rpc error: code = NotFound desc = could not 
find container \"12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce\": container with ID starting with 12580da8dc9618407ef00ccaed7364838d9dab9c227f3c9a1a4fcd1f531c19ce not found: ID does not exist" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.147659 4990 scope.go:117] "RemoveContainer" containerID="1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768" Oct 03 10:02:34 crc kubenswrapper[4990]: E1003 10:02:34.148021 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768\": container with ID starting with 1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768 not found: ID does not exist" containerID="1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.148060 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768"} err="failed to get container status \"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768\": rpc error: code = NotFound desc = could not find container \"1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768\": container with ID starting with 1b1cc72419abcce5865ffb65fbdd752aa696587bfcd346cdf6c70b1cbaf9e768 not found: ID does not exist" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.148087 4990 scope.go:117] "RemoveContainer" containerID="b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751" Oct 03 10:02:34 crc kubenswrapper[4990]: E1003 10:02:34.148490 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751\": container with ID starting with b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751 not found: ID 
does not exist" containerID="b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.148554 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751"} err="failed to get container status \"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751\": rpc error: code = NotFound desc = could not find container \"b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751\": container with ID starting with b11a46cb6e409c9bbc9caf65de6d23781ab283ba27ebc289fc93cda9e96e8751 not found: ID does not exist" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.148578 4990 scope.go:117] "RemoveContainer" containerID="3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e" Oct 03 10:02:34 crc kubenswrapper[4990]: E1003 10:02:34.148831 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e\": container with ID starting with 3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e not found: ID does not exist" containerID="3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.148885 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e"} err="failed to get container status \"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e\": rpc error: code = NotFound desc = could not find container \"3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e\": container with ID starting with 3fff5cd9a71ad02139eb4a3b5fee074fcdd2f2e5067c98a6f515be913326994e not found: ID does not exist" Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.340863 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerStarted","Data":"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb"} Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.342487 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerStarted","Data":"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83"} Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.402884 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:02:34 crc kubenswrapper[4990]: W1003 10:02:34.409702 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f2d6e23_3e0a_48b2_8fbf_04975e20081a.slice/crio-1a734cf4fc7e1890b9881856f8140b9b653346554ac4b38da9d7344834c95eaa WatchSource:0}: Error finding container 1a734cf4fc7e1890b9881856f8140b9b653346554ac4b38da9d7344834c95eaa: Status 404 returned error can't find the container with id 1a734cf4fc7e1890b9881856f8140b9b653346554ac4b38da9d7344834c95eaa Oct 03 10:02:34 crc kubenswrapper[4990]: I1003 10:02:34.890768 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8d0256-9c01-46fc-92e3-2e4e87709158" path="/var/lib/kubelet/pods/3e8d0256-9c01-46fc-92e3-2e4e87709158/volumes" Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.353120 4990 generic.go:334] "Generic (PLEG): container finished" podID="a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" containerID="18376407c1f5e9a0a8cae570c29ff378a32d84fb0a1eae862251a332518d5ed9" exitCode=0 Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.353259 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgv57" 
event={"ID":"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746","Type":"ContainerDied","Data":"18376407c1f5e9a0a8cae570c29ff378a32d84fb0a1eae862251a332518d5ed9"} Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.356635 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerStarted","Data":"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793"} Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.359252 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerStarted","Data":"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7"} Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.360997 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerStarted","Data":"ae9e95596a2988b574f8b75e1107b343823772356c008e5b6baacd114c7b5416"} Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.361111 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerStarted","Data":"1a734cf4fc7e1890b9881856f8140b9b653346554ac4b38da9d7344834c95eaa"} Oct 03 10:02:35 crc kubenswrapper[4990]: I1003 10:02:35.400013 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" podStartSLOduration=3.232825213 podStartE2EDuration="8.399993241s" podCreationTimestamp="2025-10-03 10:02:27 +0000 UTC" firstStartedPulling="2025-10-03 10:02:28.822501497 +0000 UTC m=+1130.619133354" lastFinishedPulling="2025-10-03 10:02:33.989669515 +0000 UTC m=+1135.786301382" observedRunningTime="2025-10-03 10:02:35.392347729 +0000 UTC m=+1137.188979596" watchObservedRunningTime="2025-10-03 
10:02:35.399993241 +0000 UTC m=+1137.196625098" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.124476 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.161104 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-64b566fdb9-7b8mq" podStartSLOduration=4.136228409 podStartE2EDuration="9.161079147s" podCreationTimestamp="2025-10-03 10:02:27 +0000 UTC" firstStartedPulling="2025-10-03 10:02:28.964788746 +0000 UTC m=+1130.761420613" lastFinishedPulling="2025-10-03 10:02:33.989639504 +0000 UTC m=+1135.786271351" observedRunningTime="2025-10-03 10:02:35.420109307 +0000 UTC m=+1137.216741164" watchObservedRunningTime="2025-10-03 10:02:36.161079147 +0000 UTC m=+1137.957711015" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.379503 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerStarted","Data":"baf1277b8ae44323ad33c43a230e061222265255663ebcfe00ca9283472b8d44"} Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.719701 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vgv57" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.779159 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config\") pod \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.779616 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle\") pod \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.779653 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnq88\" (UniqueName: \"kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88\") pod \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\" (UID: \"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746\") " Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.784181 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88" (OuterVolumeSpecName: "kube-api-access-wnq88") pod "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" (UID: "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746"). InnerVolumeSpecName "kube-api-access-wnq88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.811299 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config" (OuterVolumeSpecName: "config") pod "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" (UID: "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.814315 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" (UID: "a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.882421 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.882458 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.882492 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnq88\" (UniqueName: \"kubernetes.io/projected/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746-kube-api-access-wnq88\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:36 crc kubenswrapper[4990]: I1003 10:02:36.948604 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.391522 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vgv57" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.391498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vgv57" event={"ID":"a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746","Type":"ContainerDied","Data":"221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56"} Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.392030 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221bed214fa8f1dd715a089964fdb8a6cb567405007a6f5fe09f6a2c8513ad56" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.399420 4990 generic.go:334] "Generic (PLEG): container finished" podID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" containerID="035fb51c6a8dbf9baf89703a1824ed1de4a14267cb2db1bae664bb9a91aaa7e9" exitCode=0 Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.400300 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6wbg7" event={"ID":"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2","Type":"ContainerDied","Data":"035fb51c6a8dbf9baf89703a1824ed1de4a14267cb2db1bae664bb9a91aaa7e9"} Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.408208 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerStarted","Data":"aa25799031ab0399b9b2b58da350937c168f461b348e1ba770cf13f3f818ae7b"} Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.642745 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.643018 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="dnsmasq-dns" containerID="cri-o://e371d55bd10e847369d548faf700471f9d864f16ce92c8a578247b592766ceb4" gracePeriod=10 Oct 03 10:02:37 crc 
kubenswrapper[4990]: I1003 10:02:37.684598 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:37 crc kubenswrapper[4990]: E1003 10:02:37.685159 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" containerName="neutron-db-sync" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.685183 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" containerName="neutron-db-sync" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.685406 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" containerName="neutron-db-sync" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.687081 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.714373 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.738483 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.740381 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.749185 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.749862 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5xbp" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.749876 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.750126 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.788770 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803077 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v868v\" (UniqueName: \"kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803151 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqkrp\" (UniqueName: \"kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803299 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config\") pod 
\"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803545 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.803881 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.804072 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.804104 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.804184 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.804316 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.804397 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.906806 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.906903 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: 
\"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907087 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907179 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907266 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907326 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 
10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907401 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907426 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907468 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v868v\" (UniqueName: \"kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.907497 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqkrp\" (UniqueName: \"kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.909000 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.910601 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.910650 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.910820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.911252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.915547 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.915503 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.916223 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.916502 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.926215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v868v\" (UniqueName: \"kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v\") pod \"neutron-669bd9dbd-82v7b\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:37 crc kubenswrapper[4990]: I1003 10:02:37.932143 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqkrp\" (UniqueName: \"kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp\") pod \"dnsmasq-dns-6d999955c7-zszq7\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.028038 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.159131 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.263119 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.461721 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerStarted","Data":"ec3d0a32dbd0c83f3fcf5c62037d956344b45f6170c5fafaaf3a7521e9e088e1"} Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.462077 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.487058 4990 generic.go:334] "Generic (PLEG): container finished" podID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerID="e371d55bd10e847369d548faf700471f9d864f16ce92c8a578247b592766ceb4" exitCode=0 Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.487316 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" event={"ID":"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd","Type":"ContainerDied","Data":"e371d55bd10e847369d548faf700471f9d864f16ce92c8a578247b592766ceb4"} Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.494756 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.893170402 podStartE2EDuration="5.494735003s" podCreationTimestamp="2025-10-03 10:02:33 +0000 UTC" firstStartedPulling="2025-10-03 10:02:34.41250469 +0000 UTC m=+1136.209136547" lastFinishedPulling="2025-10-03 10:02:38.014069301 +0000 UTC m=+1139.810701148" observedRunningTime="2025-10-03 10:02:38.492062666 +0000 UTC m=+1140.288694523" watchObservedRunningTime="2025-10-03 10:02:38.494735003 
+0000 UTC m=+1140.291366860" Oct 03 10:02:38 crc kubenswrapper[4990]: W1003 10:02:38.630277 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7a49c0_0d9a_443b_abbe_c2ce5778d0cd.slice/crio-e7d3715abda8cd7ceb41d2d3f7dca20892664dccbbd084d56c7f06aaaf5e7b97 WatchSource:0}: Error finding container e7d3715abda8cd7ceb41d2d3f7dca20892664dccbbd084d56c7f06aaaf5e7b97: Status 404 returned error can't find the container with id e7d3715abda8cd7ceb41d2d3f7dca20892664dccbbd084d56c7f06aaaf5e7b97 Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.630463 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.779945 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.832116 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.832162 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.832225 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 
03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.832249 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.832648 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlcdc\" (UniqueName: \"kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.833324 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc\") pod \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\" (UID: \"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd\") " Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.870735 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc" (OuterVolumeSpecName: "kube-api-access-jlcdc") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "kube-api-access-jlcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.936806 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlcdc\" (UniqueName: \"kubernetes.io/projected/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-kube-api-access-jlcdc\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:38 crc kubenswrapper[4990]: I1003 10:02:38.984179 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.060812 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.061178 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ftlt\" (UniqueName: \"kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.061226 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.061367 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.061415 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.061497 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data\") pod \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\" (UID: \"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2\") " Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.066905 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.075086 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.086176 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt" (OuterVolumeSpecName: "kube-api-access-7ftlt") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "kube-api-access-7ftlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.087168 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts" (OuterVolumeSpecName: "scripts") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.093183 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.136018 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.142327 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.153114 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166053 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166088 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166103 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166114 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166128 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166139 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166150 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ftlt\" (UniqueName: \"kubernetes.io/projected/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-kube-api-access-7ftlt\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.166165 4990 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.178119 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.179671 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.213695 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config" (OuterVolumeSpecName: "config") pod "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" (UID: "ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.267970 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.268011 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.314711 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data" (OuterVolumeSpecName: "config-data") pod "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" (UID: "294e0d0f-2fbd-42f1-90ff-af3c4188f2f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.378172 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.504785 4990 generic.go:334] "Generic (PLEG): container finished" podID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerID="efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f" exitCode=0 Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.504971 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" event={"ID":"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd","Type":"ContainerDied","Data":"efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f"} Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.505074 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" 
event={"ID":"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd","Type":"ContainerStarted","Data":"e7d3715abda8cd7ceb41d2d3f7dca20892664dccbbd084d56c7f06aaaf5e7b97"} Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.507810 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6wbg7" event={"ID":"294e0d0f-2fbd-42f1-90ff-af3c4188f2f2","Type":"ContainerDied","Data":"bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8"} Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.508115 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc02ceee5c553e1d682afff624d990f5fcc629f874a7300ab982f0d2d1db0a8" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.508242 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6wbg7" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.530675 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerStarted","Data":"ae55941da89fe54a2ad7d0e69456103989d0c9ef6e19f7d5a41847fcb4549e5c"} Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.544072 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.544617 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb9f45b5-29xjr" event={"ID":"ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd","Type":"ContainerDied","Data":"343de753ee7b614002cdf85555efdc9ff6cbe82f476091837c1c56fd512d509e"} Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.544662 4990 scope.go:117] "RemoveContainer" containerID="e371d55bd10e847369d548faf700471f9d864f16ce92c8a578247b592766ceb4" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.577820 4990 scope.go:117] "RemoveContainer" containerID="47a0977a531901d7abf10b315df58756b9b29badb77ce48e3bf02401eab404f7" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.612802 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.636378 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67cb9f45b5-29xjr"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.739933 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:02:39 crc kubenswrapper[4990]: E1003 10:02:39.740473 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="init" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.740493 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="init" Oct 03 10:02:39 crc kubenswrapper[4990]: E1003 10:02:39.740633 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="dnsmasq-dns" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.740646 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="dnsmasq-dns" Oct 03 10:02:39 
crc kubenswrapper[4990]: E1003 10:02:39.740702 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" containerName="cinder-db-sync" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.740712 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" containerName="cinder-db-sync" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.740949 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" containerName="cinder-db-sync" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.740981 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" containerName="dnsmasq-dns" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.742224 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.750311 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.750485 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ckzgf" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.761679 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.761933 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.792083 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793323 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793376 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793427 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793563 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793729 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.793922 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j7g\" (UniqueName: 
\"kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.886602 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.895716 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.896005 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.896162 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j7g\" (UniqueName: \"kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.896262 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.896358 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.896459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.904203 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.922050 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.923153 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.930198 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j7g\" (UniqueName: \"kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " 
pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.936146 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.936212 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.940824 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " pod="openstack/cinder-scheduler-0" Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.949672 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:02:39 crc kubenswrapper[4990]: I1003 10:02:39.949804 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003101 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003369 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003453 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmb27\" (UniqueName: \"kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" 
(UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.003850 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.081759 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.083438 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.091427 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.103836 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106311 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106349 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106395 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106419 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmb27\" (UniqueName: \"kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.106496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.107636 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.108324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.109013 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.109683 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.110321 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.121770 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.132564 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmb27\" (UniqueName: \"kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27\") pod \"dnsmasq-dns-7f55c55b8f-6gtww\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211136 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211217 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211281 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txptn\" (UniqueName: \"kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211333 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc 
kubenswrapper[4990]: I1003 10:02:40.211370 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211407 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.211532 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.299067 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339404 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339467 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txptn\" (UniqueName: \"kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339534 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339580 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339603 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339666 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.339729 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.341288 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.341342 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.347942 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.350286 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.350744 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.352347 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.377009 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txptn\" (UniqueName: \"kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn\") pod \"cinder-api-0\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.428032 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.581350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" event={"ID":"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd","Type":"ContainerStarted","Data":"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6"} Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.581506 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="dnsmasq-dns" containerID="cri-o://14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6" gracePeriod=10 Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.581770 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.595036 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerStarted","Data":"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6"} Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.595246 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerStarted","Data":"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654"} Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.595950 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.624864 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" podStartSLOduration=3.624842349 podStartE2EDuration="3.624842349s" podCreationTimestamp="2025-10-03 10:02:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:40.609987835 +0000 UTC m=+1142.406619702" watchObservedRunningTime="2025-10-03 10:02:40.624842349 +0000 UTC m=+1142.421474206" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.637684 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-669bd9dbd-82v7b" podStartSLOduration=3.6376653709999998 podStartE2EDuration="3.637665371s" podCreationTimestamp="2025-10-03 10:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:40.630412879 +0000 UTC m=+1142.427044736" watchObservedRunningTime="2025-10-03 10:02:40.637665371 +0000 UTC m=+1142.434297228" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.797402 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.892156 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd" path="/var/lib/kubelet/pods/ba45b7a6-dfa9-4aa9-ae7b-bcfd0f8f8ffd/volumes" Oct 03 10:02:40 crc kubenswrapper[4990]: I1003 10:02:40.925463 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:02:40 crc kubenswrapper[4990]: W1003 10:02:40.952424 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe56917c_d7de_4294_8dfc_97ebddbd3240.slice/crio-d1fc92db19f1778a792b14cc742653f7729c6967b1ca89fade83fc549ac9cac3 WatchSource:0}: Error finding container d1fc92db19f1778a792b14cc742653f7729c6967b1ca89fade83fc549ac9cac3: Status 404 returned error can't find the container with id d1fc92db19f1778a792b14cc742653f7729c6967b1ca89fade83fc549ac9cac3 Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 
10:02:41.079268 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.437829 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568011 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0\") pod \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568074 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb\") pod \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568175 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqkrp\" (UniqueName: \"kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp\") pod \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568196 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config\") pod \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568373 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc\") pod 
\"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.568425 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb\") pod \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\" (UID: \"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd\") " Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.594026 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp" (OuterVolumeSpecName: "kube-api-access-wqkrp") pod "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "kube-api-access-wqkrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.644280 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.660149 4990 generic.go:334] "Generic (PLEG): container finished" podID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerID="14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6" exitCode=0 Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.660440 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" event={"ID":"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd","Type":"ContainerDied","Data":"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.660469 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" event={"ID":"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd","Type":"ContainerDied","Data":"e7d3715abda8cd7ceb41d2d3f7dca20892664dccbbd084d56c7f06aaaf5e7b97"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.660484 4990 scope.go:117] "RemoveContainer" containerID="14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.660623 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d999955c7-zszq7" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.665934 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.668404 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.670433 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqkrp\" (UniqueName: \"kubernetes.io/projected/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-kube-api-access-wqkrp\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.670453 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.670464 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.691410 4990 generic.go:334] "Generic (PLEG): container finished" podID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerID="50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6" exitCode=0 Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.691472 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" event={"ID":"be56917c-d7de-4294-8dfc-97ebddbd3240","Type":"ContainerDied","Data":"50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.691495 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" event={"ID":"be56917c-d7de-4294-8dfc-97ebddbd3240","Type":"ContainerStarted","Data":"d1fc92db19f1778a792b14cc742653f7729c6967b1ca89fade83fc549ac9cac3"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.697998 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config" (OuterVolumeSpecName: "config") pod 
"2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.716869 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.717347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerStarted","Data":"60c417c3e1490823b8990a84eb7c6ecc957c11d19eb9193af16a8177594c5b95"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.720790 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerStarted","Data":"6e24a4a87789d269ec2d138266a19637eadd317c7266d497fd304bf3bcbc3d8e"} Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.755545 4990 scope.go:117] "RemoveContainer" containerID="efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.771878 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.771902 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 
10:02:41.772205 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" (UID: "2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.782533 4990 scope.go:117] "RemoveContainer" containerID="14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6" Oct 03 10:02:41 crc kubenswrapper[4990]: E1003 10:02:41.791053 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6\": container with ID starting with 14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6 not found: ID does not exist" containerID="14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.791107 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6"} err="failed to get container status \"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6\": rpc error: code = NotFound desc = could not find container \"14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6\": container with ID starting with 14a82c4b946e95ee58209e91a7c94ff4d0a686bf259f8f87935cce1b73fab4b6 not found: ID does not exist" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.791137 4990 scope.go:117] "RemoveContainer" containerID="efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f" Oct 03 10:02:41 crc kubenswrapper[4990]: E1003 10:02:41.815119 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f\": container with ID starting with efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f not found: ID does not exist" containerID="efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.815163 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f"} err="failed to get container status \"efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f\": rpc error: code = NotFound desc = could not find container \"efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f\": container with ID starting with efa7d6a66b79ccbce54ec5c079fd0978b6530a121f01be909b25d91f11f7e84f not found: ID does not exist" Oct 03 10:02:41 crc kubenswrapper[4990]: I1003 10:02:41.876660 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.014069 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.021695 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d999955c7-zszq7"] Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.320240 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.737421 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" event={"ID":"be56917c-d7de-4294-8dfc-97ebddbd3240","Type":"ContainerStarted","Data":"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279"} Oct 03 10:02:42 crc 
kubenswrapper[4990]: I1003 10:02:42.737580 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.741697 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerStarted","Data":"79f663e83e9d8756377ac1a3e42ef117b0d981603cba7596c9d15a97c7868e50"} Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.765801 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" podStartSLOduration=3.765781826 podStartE2EDuration="3.765781826s" podCreationTimestamp="2025-10-03 10:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:42.758348939 +0000 UTC m=+1144.554980796" watchObservedRunningTime="2025-10-03 10:02:42.765781826 +0000 UTC m=+1144.562413683" Oct 03 10:02:42 crc kubenswrapper[4990]: I1003 10:02:42.883334 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" path="/var/lib/kubelet/pods/2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd/volumes" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.036314 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.619364 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.766309 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api-log" containerID="cri-o://79f663e83e9d8756377ac1a3e42ef117b0d981603cba7596c9d15a97c7868e50" gracePeriod=30 Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 
10:02:43.766810 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api" containerID="cri-o://4c08e8344db6a48d208b7b72d65983286284591804a2577adf4369e309bb05c5" gracePeriod=30 Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.767025 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerStarted","Data":"4c08e8344db6a48d208b7b72d65983286284591804a2577adf4369e309bb05c5"} Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.767060 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.796190 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.796172027 podStartE2EDuration="3.796172027s" podCreationTimestamp="2025-10-03 10:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:43.793204873 +0000 UTC m=+1145.589836770" watchObservedRunningTime="2025-10-03 10:02:43.796172027 +0000 UTC m=+1145.592803884" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.800373 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.918810 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"] Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.918987 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log" 
containerID="cri-o://2cded7f28a6d70eec113a2c1f2cd4403da85f5f7ed00423c09b48119b4dd6575" gracePeriod=30 Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.919324 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api" containerID="cri-o://5709db94fbf37881b7dcb88ca8ea55ff5d3a16e10fcac9dff43ec457d9265d4c" gracePeriod=30 Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.941702 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.942325 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Oct 03 10:02:43 crc kubenswrapper[4990]: I1003 10:02:43.942356 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.561205 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:02:44 crc kubenswrapper[4990]: E1003 10:02:44.564335 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="init" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.564358 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="init" Oct 03 10:02:44 crc 
kubenswrapper[4990]: E1003 10:02:44.564435 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="dnsmasq-dns" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.564443 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="dnsmasq-dns" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.564639 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7a49c0-0d9a-443b-abbe-c2ce5778d0cd" containerName="dnsmasq-dns" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.565545 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.572003 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.572391 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.600710 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654109 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654209 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgcxp\" (UniqueName: \"kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " 
pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654288 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654371 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654397 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654438 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.654500 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: 
\"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.756936 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757332 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757364 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757413 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757486 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 
10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757546 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.757616 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgcxp\" (UniqueName: \"kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.777319 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.780358 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.789005 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.792250 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.824629 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.824807 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgcxp\" (UniqueName: \"kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.832204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs\") pod \"neutron-6778f9d745-ft6gs\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.834276 4990 generic.go:334] "Generic (PLEG): container finished" podID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerID="4c08e8344db6a48d208b7b72d65983286284591804a2577adf4369e309bb05c5" exitCode=0 Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.834310 4990 generic.go:334] "Generic (PLEG): container finished" podID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerID="79f663e83e9d8756377ac1a3e42ef117b0d981603cba7596c9d15a97c7868e50" exitCode=143 Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.834351 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerDied","Data":"4c08e8344db6a48d208b7b72d65983286284591804a2577adf4369e309bb05c5"} Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.834383 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerDied","Data":"79f663e83e9d8756377ac1a3e42ef117b0d981603cba7596c9d15a97c7868e50"} Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.853324 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerStarted","Data":"e2e8bdef768843478c1ffdc2c53fe795c313188a7c0fb2bab2a6f29e6571ddf1"} Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.861181 4990 generic.go:334] "Generic (PLEG): container finished" podID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerID="2cded7f28a6d70eec113a2c1f2cd4403da85f5f7ed00423c09b48119b4dd6575" exitCode=143 Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.861226 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerDied","Data":"2cded7f28a6d70eec113a2c1f2cd4403da85f5f7ed00423c09b48119b4dd6575"} Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.918546 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:44 crc kubenswrapper[4990]: I1003 10:02:44.976429 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.068692 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069068 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069105 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069170 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069204 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txptn\" (UniqueName: \"kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069290 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.069357 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.068826 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.070903 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs" (OuterVolumeSpecName: "logs") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.079820 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn" (OuterVolumeSpecName: "kube-api-access-txptn") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "kube-api-access-txptn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.083580 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.083782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts" (OuterVolumeSpecName: "scripts") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.102359 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.170223 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data" (OuterVolumeSpecName: "config-data") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.170854 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") pod \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\" (UID: \"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d\") " Oct 03 10:02:45 crc kubenswrapper[4990]: W1003 10:02:45.171118 4990 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d/volumes/kubernetes.io~secret/config-data Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.171166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data" (OuterVolumeSpecName: "config-data") pod "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" (UID: "28f6c541-6a69-43fe-b948-1a5d2f8b4f1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.171975 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.171997 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.172006 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.172015 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.172024 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.172032 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.172040 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txptn\" (UniqueName: \"kubernetes.io/projected/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d-kube-api-access-txptn\") on node \"crc\" DevicePath \"\"" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.516224 4990 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.871937 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"28f6c541-6a69-43fe-b948-1a5d2f8b4f1d","Type":"ContainerDied","Data":"60c417c3e1490823b8990a84eb7c6ecc957c11d19eb9193af16a8177594c5b95"} Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.872210 4990 scope.go:117] "RemoveContainer" containerID="4c08e8344db6a48d208b7b72d65983286284591804a2577adf4369e309bb05c5" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.872330 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.887357 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerStarted","Data":"20b2317c5c0af2fb3a819cf7bdafebe6269a8914cfbc0ffd9b5469ad525d559c"} Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.889820 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerStarted","Data":"c1e969493c2c4af3c85759ecb6094d97806d479c080aac159e4825c9e30be4a7"} Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.889862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerStarted","Data":"517dbb72f8123f4ff06aded4a73697fe77e5011aea1784f0e1d1562405d18e44"} Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.927748 4990 scope.go:117] "RemoveContainer" containerID="79f663e83e9d8756377ac1a3e42ef117b0d981603cba7596c9d15a97c7868e50" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.949093 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=4.665980292 podStartE2EDuration="6.949074546s" podCreationTimestamp="2025-10-03 10:02:39 +0000 UTC" firstStartedPulling="2025-10-03 10:02:40.841104339 +0000 UTC m=+1142.637736196" lastFinishedPulling="2025-10-03 10:02:43.124198603 +0000 UTC m=+1144.920830450" observedRunningTime="2025-10-03 10:02:45.92819488 +0000 UTC m=+1147.724826747" watchObservedRunningTime="2025-10-03 10:02:45.949074546 +0000 UTC m=+1147.745706403" Oct 03 10:02:45 crc kubenswrapper[4990]: I1003 10:02:45.994738 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.014145 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.031850 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:46 crc kubenswrapper[4990]: E1003 10:02:46.032212 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.032230 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api" Oct 03 10:02:46 crc kubenswrapper[4990]: E1003 10:02:46.032253 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api-log" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.032263 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api-log" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.032433 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api-log" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.032467 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" containerName="cinder-api" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.033365 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.038190 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.038926 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.039016 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.074895 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091460 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091544 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091571 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " 
pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091597 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091624 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091652 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091681 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091755 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf9x\" (UniqueName: \"kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.091777 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: E1003 10:02:46.178069 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f6c541_6a69_43fe_b948_1a5d2f8b4f1d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193608 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193661 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193683 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193707 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193735 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193762 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193792 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193854 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf9x\" (UniqueName: \"kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.193877 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.195172 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.196674 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.201049 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.201239 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.201243 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.201783 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 
10:02:46.204256 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.205971 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.215176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf9x\" (UniqueName: \"kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x\") pod \"cinder-api-0\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") " pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.366672 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.899368 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f6c541-6a69-43fe-b948-1a5d2f8b4f1d" path="/var/lib/kubelet/pods/28f6c541-6a69-43fe-b948-1a5d2f8b4f1d/volumes" Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.900875 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.902341 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerStarted","Data":"47b3ee5fe09c1288dcc55759d5124e0dfde3e2307c5ecb7f0af3741bbfc250d9"} Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.906371 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerStarted","Data":"275691f37f6f67a2de209a5a1eaf8c7ee5473a0d43e2287e05d49c37f8c3fa41"} Oct 03 10:02:46 crc kubenswrapper[4990]: I1003 10:02:46.906555 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:02:47 crc kubenswrapper[4990]: I1003 10:02:47.717779 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:02:47 crc kubenswrapper[4990]: I1003 10:02:47.738943 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6778f9d745-ft6gs" podStartSLOduration=3.738927532 podStartE2EDuration="3.738927532s" podCreationTimestamp="2025-10-03 10:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:46.92674448 +0000 UTC m=+1148.723376347" watchObservedRunningTime="2025-10-03 10:02:47.738927532 +0000 UTC m=+1149.535559389" Oct 03 10:02:47 crc 
kubenswrapper[4990]: I1003 10:02:47.930269 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerStarted","Data":"3eac1b1500d053663746a028e055851b9a9cb1d22a20f4428e053e6a9d74b23f"} Oct 03 10:02:48 crc kubenswrapper[4990]: I1003 10:02:48.938167 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerStarted","Data":"5af4d58eb46b303caa40337b751ef9459fc70d7f1edca5370cef7c7ece2d18ae"} Oct 03 10:02:48 crc kubenswrapper[4990]: I1003 10:02:48.938400 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 10:02:48 crc kubenswrapper[4990]: I1003 10:02:48.968016 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.96799803 podStartE2EDuration="3.96799803s" podCreationTimestamp="2025-10-03 10:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:02:48.963150628 +0000 UTC m=+1150.759782495" watchObservedRunningTime="2025-10-03 10:02:48.96799803 +0000 UTC m=+1150.764629887" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.359470 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:46616->10.217.0.157:9311: read: connection reset by peer" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.360099 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-f5888695d-8qpvg" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:46618->10.217.0.157:9311: read: connection reset by peer" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.531537 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.533059 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.534964 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.535102 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tnwvg" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.535336 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.543436 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.661009 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.661081 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.661113 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdnf\" (UniqueName: \"kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.661129 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.762343 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.762398 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.762427 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdnf\" (UniqueName: \"kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.762445 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.763360 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.783943 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.784185 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.799042 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpdnf\" (UniqueName: \"kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf\") pod \"openstackclient\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " pod="openstack/openstackclient" Oct 03 10:02:49 crc kubenswrapper[4990]: I1003 10:02:49.858385 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.006101 4990 generic.go:334] "Generic (PLEG): container finished" podID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerID="5709db94fbf37881b7dcb88ca8ea55ff5d3a16e10fcac9dff43ec457d9265d4c" exitCode=0 Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.007830 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerDied","Data":"5709db94fbf37881b7dcb88ca8ea55ff5d3a16e10fcac9dff43ec457d9265d4c"} Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.007865 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f5888695d-8qpvg" event={"ID":"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9","Type":"ContainerDied","Data":"e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8"} Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.007878 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7b2e6a746f571b8c7ed003e023989026062e0b2154a0803e8a96ba8468d94f8" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.024587 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-f5888695d-8qpvg" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.072187 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom\") pod \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.072270 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle\") pod \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.072390 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfgmw\" (UniqueName: \"kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw\") pod \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.072490 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs\") pod \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.072615 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data\") pod \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\" (UID: \"5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9\") " Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.089727 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" (UID: "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.095953 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs" (OuterVolumeSpecName: "logs") pod "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" (UID: "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.106712 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw" (OuterVolumeSpecName: "kube-api-access-lfgmw") pod "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" (UID: "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9"). InnerVolumeSpecName "kube-api-access-lfgmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.122690 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.130873 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" (UID: "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.158244 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data" (OuterVolumeSpecName: "config-data") pod "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" (UID: "5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.175775 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfgmw\" (UniqueName: \"kubernetes.io/projected/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-kube-api-access-lfgmw\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.175823 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-logs\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.175838 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.175851 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.175864 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.301718 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww"
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.365469 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"]
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.365721 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="dnsmasq-dns" containerID="cri-o://c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700" gracePeriod=10
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.390107 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.436816 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.577686 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused"
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.921409 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2"
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990452 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990592 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990636 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvssv\" (UniqueName: \"kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990733 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.990854 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb\") pod \"a06fc0e3-65b2-42fc-abac-e1774866d250\" (UID: \"a06fc0e3-65b2-42fc-abac-e1774866d250\") "
Oct 03 10:02:50 crc kubenswrapper[4990]: I1003 10:02:50.995750 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv" (OuterVolumeSpecName: "kube-api-access-tvssv") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "kube-api-access-tvssv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.025837 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a6db8725-5245-49dc-8bbc-9b8741622c42","Type":"ContainerStarted","Data":"ba9ce866b14c320911eed140afc2a3d6151cab89667d9fd7d50c4ab270092cc1"}
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.035923 4990 generic.go:334] "Generic (PLEG): container finished" podID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerID="c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700" exitCode=0
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.037616 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.038628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" event={"ID":"a06fc0e3-65b2-42fc-abac-e1774866d250","Type":"ContainerDied","Data":"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"}
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.040839 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78df67ddff-2zbr2" event={"ID":"a06fc0e3-65b2-42fc-abac-e1774866d250","Type":"ContainerDied","Data":"3c2decd74c81bfffbdebd8aa3dc6d2d51585be43cdb4c19e16f6c5d4d6cc828e"}
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.041045 4990 scope.go:117] "RemoveContainer" containerID="c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.041634 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f5888695d-8qpvg"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.075247 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.076681 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config" (OuterVolumeSpecName: "config") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.085978 4990 scope.go:117] "RemoveContainer" containerID="77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.093976 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.094011 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-config\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.094023 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvssv\" (UniqueName: \"kubernetes.io/projected/a06fc0e3-65b2-42fc-abac-e1774866d250-kube-api-access-tvssv\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.099134 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.116588 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"]
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.117899 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.122448 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.124605 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a06fc0e3-65b2-42fc-abac-e1774866d250" (UID: "a06fc0e3-65b2-42fc-abac-e1774866d250"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.127606 4990 scope.go:117] "RemoveContainer" containerID="c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.128186 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-f5888695d-8qpvg"]
Oct 03 10:02:51 crc kubenswrapper[4990]: E1003 10:02:51.131708 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700\": container with ID starting with c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700 not found: ID does not exist" containerID="c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.131750 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700"} err="failed to get container status \"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700\": rpc error: code = NotFound desc = could not find container \"c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700\": container with ID starting with c2ce9352932fc21b6a31a8ff436a9a8d1542f14d3ee95d3a103f36907990e700 not found: ID does not exist"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.131775 4990 scope.go:117] "RemoveContainer" containerID="77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3"
Oct 03 10:02:51 crc kubenswrapper[4990]: E1003 10:02:51.132085 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3\": container with ID starting with 77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3 not found: ID does not exist" containerID="77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.132109 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3"} err="failed to get container status \"77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3\": rpc error: code = NotFound desc = could not find container \"77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3\": container with ID starting with 77c870e99a2606d1ff3f52ad06e22966ffe0f310a553f1ffa9d1b19ba170c6a3 not found: ID does not exist"
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.195986 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.196026 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.196038 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a06fc0e3-65b2-42fc-abac-e1774866d250-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.389936 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"]
Oct 03 10:02:51 crc kubenswrapper[4990]: I1003 10:02:51.397327 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78df67ddff-2zbr2"]
Oct 03 10:02:52 crc kubenswrapper[4990]: I1003 10:02:52.054660 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="cinder-scheduler" containerID="cri-o://e2e8bdef768843478c1ffdc2c53fe795c313188a7c0fb2bab2a6f29e6571ddf1" gracePeriod=30
Oct 03 10:02:52 crc kubenswrapper[4990]: I1003 10:02:52.054808 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="probe" containerID="cri-o://20b2317c5c0af2fb3a819cf7bdafebe6269a8914cfbc0ffd9b5469ad525d559c" gracePeriod=30
Oct 03 10:02:52 crc kubenswrapper[4990]: I1003 10:02:52.888409 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" path="/var/lib/kubelet/pods/5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9/volumes"
Oct 03 10:02:52 crc kubenswrapper[4990]: I1003 10:02:52.890037 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" path="/var/lib/kubelet/pods/a06fc0e3-65b2-42fc-abac-e1774866d250/volumes"
Oct 03 10:02:53 crc kubenswrapper[4990]: I1003 10:02:53.117190 4990 generic.go:334] "Generic (PLEG): container finished" podID="45912a4c-7d65-4d65-846d-03853f467cf6" containerID="20b2317c5c0af2fb3a819cf7bdafebe6269a8914cfbc0ffd9b5469ad525d559c" exitCode=0
Oct 03 10:02:53 crc kubenswrapper[4990]: I1003 10:02:53.117261 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerDied","Data":"20b2317c5c0af2fb3a819cf7bdafebe6269a8914cfbc0ffd9b5469ad525d559c"}
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.055538 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.055812 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-log" containerID="cri-o://e278b7414732cf851499cb8238bb151fdad6ab1335eedadf17655c20cf21385c" gracePeriod=30
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.056173 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-httpd" containerID="cri-o://aa9a6c01fafc1b1871bc187c046c06c9b62a3ca43302bbdf8d59afe00dc56ff3" gracePeriod=30
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.536547 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"]
Oct 03 10:02:54 crc kubenswrapper[4990]: E1003 10:02:54.538067 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="init"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.538188 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="init"
Oct 03 10:02:54 crc kubenswrapper[4990]: E1003 10:02:54.538272 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.538342 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log"
Oct 03 10:02:54 crc kubenswrapper[4990]: E1003 10:02:54.538412 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.538479 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api"
Oct 03 10:02:54 crc kubenswrapper[4990]: E1003 10:02:54.538596 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="dnsmasq-dns"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.538670 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="dnsmasq-dns"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.538991 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.539106 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06fc0e3-65b2-42fc-abac-e1774866d250" containerName="dnsmasq-dns"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.539194 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc68b94-34c1-4fe2-8b30-08b0fc7aaaf9" containerName="barbican-api-log"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.540398 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.545388 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.545673 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.547146 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.556703 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"]
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563361 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563413 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563474 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563540 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwsf\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563572 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563608 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.563627 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665225 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665290 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665314 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665352 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665399 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwsf\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665427 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665462 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665480 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.665861 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.666387 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.675622 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.677161 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.677472 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.677970 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.678073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.690845 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwsf\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf\") pod \"swift-proxy-5fb98f794c-zgdcx\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.863291 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5fb98f794c-zgdcx"
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.994829 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.995169 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-central-agent" containerID="cri-o://ae9e95596a2988b574f8b75e1107b343823772356c008e5b6baacd114c7b5416" gracePeriod=30
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.995321 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="proxy-httpd" containerID="cri-o://ec3d0a32dbd0c83f3fcf5c62037d956344b45f6170c5fafaaf3a7521e9e088e1" gracePeriod=30
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.995382 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="sg-core" containerID="cri-o://aa25799031ab0399b9b2b58da350937c168f461b348e1ba770cf13f3f818ae7b" gracePeriod=30
Oct 03 10:02:54 crc kubenswrapper[4990]: I1003 10:02:54.995429 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-notification-agent" containerID="cri-o://baf1277b8ae44323ad33c43a230e061222265255663ebcfe00ca9283472b8d44" gracePeriod=30
Oct 03 10:02:55 crc kubenswrapper[4990]: I1003 10:02:55.012568 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": EOF"
Oct 03 10:02:55 crc kubenswrapper[4990]: I1003 10:02:55.139416 4990 generic.go:334] "Generic (PLEG): container finished" podID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerID="aa25799031ab0399b9b2b58da350937c168f461b348e1ba770cf13f3f818ae7b" exitCode=2
Oct 03 10:02:55 crc kubenswrapper[4990]: I1003 10:02:55.139480 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerDied","Data":"aa25799031ab0399b9b2b58da350937c168f461b348e1ba770cf13f3f818ae7b"}
Oct 03 10:02:55 crc kubenswrapper[4990]: I1003 10:02:55.143137 4990 generic.go:334] "Generic (PLEG): container finished" podID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerID="e278b7414732cf851499cb8238bb151fdad6ab1335eedadf17655c20cf21385c" exitCode=143
Oct 03 10:02:55 crc kubenswrapper[4990]: I1003 10:02:55.143180 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerDied","Data":"e278b7414732cf851499cb8238bb151fdad6ab1335eedadf17655c20cf21385c"}
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.154339 4990 generic.go:334] "Generic (PLEG): container finished" podID="45912a4c-7d65-4d65-846d-03853f467cf6" containerID="e2e8bdef768843478c1ffdc2c53fe795c313188a7c0fb2bab2a6f29e6571ddf1" exitCode=0
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.154409 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerDied","Data":"e2e8bdef768843478c1ffdc2c53fe795c313188a7c0fb2bab2a6f29e6571ddf1"}
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158821 4990 generic.go:334] "Generic (PLEG): container finished" podID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerID="ec3d0a32dbd0c83f3fcf5c62037d956344b45f6170c5fafaaf3a7521e9e088e1" exitCode=0
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158847 4990 generic.go:334] "Generic (PLEG): container finished" podID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerID="baf1277b8ae44323ad33c43a230e061222265255663ebcfe00ca9283472b8d44" exitCode=0
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158855 4990 generic.go:334] "Generic (PLEG): container finished" podID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerID="ae9e95596a2988b574f8b75e1107b343823772356c008e5b6baacd114c7b5416" exitCode=0
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158874 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerDied","Data":"ec3d0a32dbd0c83f3fcf5c62037d956344b45f6170c5fafaaf3a7521e9e088e1"}
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158902 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerDied","Data":"baf1277b8ae44323ad33c43a230e061222265255663ebcfe00ca9283472b8d44"}
Oct 03 10:02:56 crc kubenswrapper[4990]: I1003 10:02:56.158913 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerDied","Data":"ae9e95596a2988b574f8b75e1107b343823772356c008e5b6baacd114c7b5416"}
Oct 03 10:02:57 crc kubenswrapper[4990]: I1003 10:02:57.206728 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:43578->10.217.0.150:9292: read: connection reset by peer"
Oct 03 10:02:57 crc kubenswrapper[4990]: I1003 10:02:57.206743 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:43592->10.217.0.150:9292: read: connection reset by peer"
Oct 03 10:02:58 crc kubenswrapper[4990]: I1003 10:02:58.179382 4990 generic.go:334] "Generic (PLEG): container finished" podID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerID="aa9a6c01fafc1b1871bc187c046c06c9b62a3ca43302bbdf8d59afe00dc56ff3" exitCode=0
Oct 03 10:02:58 crc kubenswrapper[4990]: I1003 10:02:58.179434 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerDied","Data":"aa9a6c01fafc1b1871bc187c046c06c9b62a3ca43302bbdf8d59afe00dc56ff3"}
Oct 03 10:02:58 crc kubenswrapper[4990]: I1003 10:02:58.848112 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 03 10:03:01 crc kubenswrapper[4990]: I1003 10:03:01.760896 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 10:03:01 crc kubenswrapper[4990]: I1003 10:03:01.761962 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-log" containerID="cri-o://bf7f01839d28dad19c4310b3009994bb5a5bcbc28458035b4b139da18b099d83" gracePeriod=30
Oct 03 10:03:01 crc kubenswrapper[4990]: I1003 10:03:01.762897 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-httpd" containerID="cri-o://cd36141f435da361530acef5da9f4fb0c32f1ae7b9aa07b1f26b3c189bca420a" gracePeriod=30
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.197757 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"]
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.252655 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerStarted","Data":"430e8f0f9b1648a681ddd8b3527681286bb674db23ebbfc03d262e3c6cfbdb61"}
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.256282 4990 generic.go:334] "Generic (PLEG): container finished" podID="d9b7ab18-051d-4512-a87f-962750e82da6" containerID="bf7f01839d28dad19c4310b3009994bb5a5bcbc28458035b4b139da18b099d83" exitCode=143
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.256600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerDied","Data":"bf7f01839d28dad19c4310b3009994bb5a5bcbc28458035b4b139da18b099d83"}
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.261689 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a6db8725-5245-49dc-8bbc-9b8741622c42","Type":"ContainerStarted","Data":"61366d13516cfa8aeb4e891ab816effe7f676485e8ed9d36f65358ce050b6251"}
Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.313102 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.320388 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.830422262 podStartE2EDuration="13.320368095s" podCreationTimestamp="2025-10-03 10:02:49 +0000 UTC" firstStartedPulling="2025-10-03 10:02:50.39228412 +0000 UTC m=+1152.188915977" lastFinishedPulling="2025-10-03 10:03:01.882229953 +0000 UTC m=+1163.678861810" observedRunningTime="2025-10-03 10:03:02.279876997 +0000 UTC m=+1164.076508854" watchObservedRunningTime="2025-10-03 10:03:02.320368095 +0000 UTC m=+1164.116999952" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.322052 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425227 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425310 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425373 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425398 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d2j7g\" (UniqueName: \"kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425442 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425493 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425547 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425631 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425665 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hsr7\" (UniqueName: \"kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: 
\"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425704 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425743 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425782 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle\") pod \"45912a4c-7d65-4d65-846d-03853f467cf6\" (UID: \"45912a4c-7d65-4d65-846d-03853f467cf6\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.425884 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml\") pod \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\" (UID: \"4f2d6e23-3e0a-48b2-8fbf-04975e20081a\") " Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.428830 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.429075 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.429376 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.433815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.438129 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts" (OuterVolumeSpecName: "scripts") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.439669 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g" (OuterVolumeSpecName: "kube-api-access-d2j7g") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "kube-api-access-d2j7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.439806 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7" (OuterVolumeSpecName: "kube-api-access-5hsr7") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "kube-api-access-5hsr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.439883 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts" (OuterVolumeSpecName: "scripts") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528337 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45912a4c-7d65-4d65-846d-03853f467cf6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528369 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528381 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528391 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2j7g\" (UniqueName: \"kubernetes.io/projected/45912a4c-7d65-4d65-846d-03853f467cf6-kube-api-access-d2j7g\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528403 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528414 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528425 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hsr7\" (UniqueName: \"kubernetes.io/projected/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-kube-api-access-5hsr7\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.528435 4990 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.690158 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.729721 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data" (OuterVolumeSpecName: "config-data") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.734057 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.734089 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.766090 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45912a4c-7d65-4d65-846d-03853f467cf6" (UID: "45912a4c-7d65-4d65-846d-03853f467cf6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.811803 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data" (OuterVolumeSpecName: "config-data") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.818675 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f2d6e23-3e0a-48b2-8fbf-04975e20081a" (UID: "4f2d6e23-3e0a-48b2-8fbf-04975e20081a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.835337 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.835374 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2d6e23-3e0a-48b2-8fbf-04975e20081a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.835386 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45912a4c-7d65-4d65-846d-03853f467cf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:02 crc kubenswrapper[4990]: I1003 10:03:02.914613 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039173 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039665 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039788 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9pkx\" (UniqueName: \"kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039851 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039881 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039903 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039932 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.039959 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs\") pod \"c4d40201-da82-4b6e-b9b1-acaab48c2885\" (UID: \"c4d40201-da82-4b6e-b9b1-acaab48c2885\") " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.041018 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs" (OuterVolumeSpecName: "logs") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.041830 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.048959 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts" (OuterVolumeSpecName: "scripts") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.049652 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.054764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx" (OuterVolumeSpecName: "kube-api-access-t9pkx") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "kube-api-access-t9pkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.096489 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.135591 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142176 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142223 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142234 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142246 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4d40201-da82-4b6e-b9b1-acaab48c2885-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142255 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142263 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.142271 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9pkx\" (UniqueName: \"kubernetes.io/projected/c4d40201-da82-4b6e-b9b1-acaab48c2885-kube-api-access-t9pkx\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.146639 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data" (OuterVolumeSpecName: "config-data") pod "c4d40201-da82-4b6e-b9b1-acaab48c2885" (UID: "c4d40201-da82-4b6e-b9b1-acaab48c2885"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.159945 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.243431 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4d40201-da82-4b6e-b9b1-acaab48c2885-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.243769 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.272465 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c4d40201-da82-4b6e-b9b1-acaab48c2885","Type":"ContainerDied","Data":"e2ad86cae88692b3f40c6dc581acaebfeb64760fdba457409aa94676a1ae4d66"} Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.272500 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.272530 4990 scope.go:117] "RemoveContainer" containerID="aa9a6c01fafc1b1871bc187c046c06c9b62a3ca43302bbdf8d59afe00dc56ff3" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.275004 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"45912a4c-7d65-4d65-846d-03853f467cf6","Type":"ContainerDied","Data":"6e24a4a87789d269ec2d138266a19637eadd317c7266d497fd304bf3bcbc3d8e"} Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.275131 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.277115 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f2d6e23-3e0a-48b2-8fbf-04975e20081a","Type":"ContainerDied","Data":"1a734cf4fc7e1890b9881856f8140b9b653346554ac4b38da9d7344834c95eaa"} Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.277192 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.283911 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerStarted","Data":"abcf8fd5281bc2aae312ef1aab5f533fd6ed7860e53c537ab6d669fe5cb383bc"} Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.283947 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerStarted","Data":"fbd93aafb30841d9e18e1ff1f7a28d4ee830685ba26f421b05af7ea452b37f7c"} Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.302371 4990 scope.go:117] "RemoveContainer" containerID="e278b7414732cf851499cb8238bb151fdad6ab1335eedadf17655c20cf21385c" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.305277 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.313516 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.320737 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.321230 4990 scope.go:117] "RemoveContainer" containerID="20b2317c5c0af2fb3a819cf7bdafebe6269a8914cfbc0ffd9b5469ad525d559c" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.341999 4990 scope.go:117] "RemoveContainer" containerID="e2e8bdef768843478c1ffdc2c53fe795c313188a7c0fb2bab2a6f29e6571ddf1" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.342140 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.353855 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 
10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.371678 4990 scope.go:117] "RemoveContainer" containerID="ec3d0a32dbd0c83f3fcf5c62037d956344b45f6170c5fafaaf3a7521e9e088e1" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.378573 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.378957 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="proxy-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.378973 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="proxy-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.378986 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="probe" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.378993 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="probe" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379009 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="sg-core" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379017 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="sg-core" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379029 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-log" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379034 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-log" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379050 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="cinder-scheduler" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379055 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="cinder-scheduler" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379070 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-notification-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379077 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-notification-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379086 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-central-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379093 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-central-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: E1003 10:03:03.379105 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379111 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379287 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-central-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379298 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="sg-core" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379308 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="proxy-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379320 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="probe" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379332 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-log" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379338 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" containerName="cinder-scheduler" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379348 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" containerName="glance-httpd" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.379359 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" containerName="ceilometer-notification-agent" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.380861 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.385775 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.385938 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.414436 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.421565 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.423171 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.424158 4990 scope.go:117] "RemoveContainer" containerID="aa25799031ab0399b9b2b58da350937c168f461b348e1ba770cf13f3f818ae7b" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.427836 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.428863 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.449572 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.449837 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.449922 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzf5\" (UniqueName: \"kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.449951 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.449996 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.450023 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.450040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd\") pod \"ceilometer-0\" (UID: 
\"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.460256 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.463350 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.465737 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.468938 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.474102 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.483365 4990 scope.go:117] "RemoveContainer" containerID="baf1277b8ae44323ad33c43a230e061222265255663ebcfe00ca9283472b8d44" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.489816 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5fb98f794c-zgdcx" podStartSLOduration=9.489795973 podStartE2EDuration="9.489795973s" podCreationTimestamp="2025-10-03 10:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:03:03.359653589 +0000 UTC m=+1165.156285446" watchObservedRunningTime="2025-10-03 10:03:03.489795973 +0000 UTC m=+1165.286427830" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.501568 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.550974 4990 scope.go:117] "RemoveContainer" 
containerID="ae9e95596a2988b574f8b75e1107b343823772356c008e5b6baacd114c7b5416" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.553312 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhcgf\" (UniqueName: \"kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.553384 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.553465 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555254 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555313 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555340 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555380 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555434 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555480 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555646 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 
10:03:03.555670 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555705 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555727 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555752 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzts\" (UniqueName: \"kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555768 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc 
kubenswrapper[4990]: I1003 10:03:03.555798 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555865 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trzf5\" (UniqueName: \"kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555910 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555944 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555969 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.555987 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.556447 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.560137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.560592 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.561124 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.565665 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.563997 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.576707 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzf5\" (UniqueName: \"kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5\") pod \"ceilometer-0\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657421 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657492 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657610 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657640 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzts\" (UniqueName: 
\"kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657664 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657689 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657767 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhcgf\" (UniqueName: \"kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657812 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657835 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657912 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657941 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.657973 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc 
kubenswrapper[4990]: I1003 10:03:03.658501 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.659217 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.659753 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.661642 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.663620 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.665571 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.668852 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.671136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.674323 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.675831 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.677230 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.680926 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.682989 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzts\" (UniqueName: \"kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.707002 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhcgf\" (UniqueName: \"kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf\") pod \"cinder-scheduler-0\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.708556 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.713233 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " pod="openstack/glance-default-internal-api-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.754932 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:03:03 crc kubenswrapper[4990]: I1003 10:03:03.783385 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.192345 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.297136 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerStarted","Data":"f51cf55ac0b76f4e518d7b35ad5d1ab7bbadb723261dbb26d8a67f493b2c1ea9"} Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.297367 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.297414 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:03:04 crc kubenswrapper[4990]: W1003 10:03:04.395760 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4064799a_3601_4426_a225_151729d11c97.slice/crio-75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb WatchSource:0}: Error finding container 75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb: Status 404 returned error can't find the container with id 75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.397265 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.465879 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.886223 4990 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="45912a4c-7d65-4d65-846d-03853f467cf6" path="/var/lib/kubelet/pods/45912a4c-7d65-4d65-846d-03853f467cf6/volumes" Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.887169 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2d6e23-3e0a-48b2-8fbf-04975e20081a" path="/var/lib/kubelet/pods/4f2d6e23-3e0a-48b2-8fbf-04975e20081a/volumes" Oct 03 10:03:04 crc kubenswrapper[4990]: I1003 10:03:04.891765 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d40201-da82-4b6e-b9b1-acaab48c2885" path="/var/lib/kubelet/pods/c4d40201-da82-4b6e-b9b1-acaab48c2885/volumes" Oct 03 10:03:05 crc kubenswrapper[4990]: I1003 10:03:05.315284 4990 generic.go:334] "Generic (PLEG): container finished" podID="d9b7ab18-051d-4512-a87f-962750e82da6" containerID="cd36141f435da361530acef5da9f4fb0c32f1ae7b9aa07b1f26b3c189bca420a" exitCode=0 Oct 03 10:03:05 crc kubenswrapper[4990]: I1003 10:03:05.315687 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerDied","Data":"cd36141f435da361530acef5da9f4fb0c32f1ae7b9aa07b1f26b3c189bca420a"} Oct 03 10:03:05 crc kubenswrapper[4990]: I1003 10:03:05.318015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerStarted","Data":"2320cf1b112fb76ae26987d386aaedb7324559fc85344204f40758a4889f9ef5"} Oct 03 10:03:05 crc kubenswrapper[4990]: I1003 10:03:05.320685 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerStarted","Data":"75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb"} Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.353287 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerStarted","Data":"502b9a05d649b3d317af1234a81ded4c49063df93ac40a282cdb421bb1ed6a5b"} Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.357426 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerStarted","Data":"b9379e1b8b4ae7edb841ea4eecf2f3729d7382db2b0cb9c51bafb1386c50c7f8"} Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.373461 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerStarted","Data":"20d2d25ce9bcc245f400e40ad8c0300acef06aacb84e0cd069565068a37a1714"} Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.578983 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.602925 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638133 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638205 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638243 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwks9\" (UniqueName: \"kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638278 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638561 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638600 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638685 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.638749 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle\") pod \"d9b7ab18-051d-4512-a87f-962750e82da6\" (UID: \"d9b7ab18-051d-4512-a87f-962750e82da6\") " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.641107 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.656792 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs" (OuterVolumeSpecName: "logs") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.660081 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.691806 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9" (OuterVolumeSpecName: "kube-api-access-gwks9") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "kube-api-access-gwks9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.691925 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts" (OuterVolumeSpecName: "scripts") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.697640 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.729797 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data" (OuterVolumeSpecName: "config-data") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.740990 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741031 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741045 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741057 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741085 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741096 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwks9\" (UniqueName: 
\"kubernetes.io/projected/d9b7ab18-051d-4512-a87f-962750e82da6-kube-api-access-gwks9\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.741110 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9b7ab18-051d-4512-a87f-962750e82da6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.762899 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d9b7ab18-051d-4512-a87f-962750e82da6" (UID: "d9b7ab18-051d-4512-a87f-962750e82da6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.769312 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.843145 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9b7ab18-051d-4512-a87f-962750e82da6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:07 crc kubenswrapper[4990]: I1003 10:03:07.843177 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.291477 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.390789 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerStarted","Data":"de83fa39f0019c8400c38ca9ffa8825e384c428252e8724d9b3c68ce695b214d"} Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.395817 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9b7ab18-051d-4512-a87f-962750e82da6","Type":"ContainerDied","Data":"a7d2ad1e3bcf8443837ef75df589d4bd3b46804abe751448f8c7636cfab87696"} Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.395865 4990 scope.go:117] "RemoveContainer" containerID="cd36141f435da361530acef5da9f4fb0c32f1ae7b9aa07b1f26b3c189bca420a" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.395996 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.416676 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.416660704 podStartE2EDuration="5.416660704s" podCreationTimestamp="2025-10-03 10:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:03:08.415011873 +0000 UTC m=+1170.211643730" watchObservedRunningTime="2025-10-03 10:03:08.416660704 +0000 UTC m=+1170.213292561" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.426579 4990 scope.go:117] "RemoveContainer" containerID="bf7f01839d28dad19c4310b3009994bb5a5bcbc28458035b4b139da18b099d83" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.441563 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.449292 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.497467 4990 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:03:08 crc kubenswrapper[4990]: E1003 10:03:08.497960 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-httpd" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.497980 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-httpd" Oct 03 10:03:08 crc kubenswrapper[4990]: E1003 10:03:08.498019 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-log" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.498026 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-log" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.498200 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-httpd" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.498227 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" containerName="glance-log" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.499121 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.506052 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.506545 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.506965 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.567872 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.567923 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568063 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568100 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568263 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwzwb\" (UniqueName: \"kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568378 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568403 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.568522 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670394 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670745 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670775 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670801 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670858 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs\") pod \"glance-default-external-api-0\" 
(UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670877 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670927 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwzwb\" (UniqueName: \"kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.670927 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.671300 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.672460 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " 
pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.679306 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.684046 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.685138 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.690602 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.690946 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwzwb\" (UniqueName: \"kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: 
I1003 10:03:08.747974 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.819337 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:03:08 crc kubenswrapper[4990]: I1003 10:03:08.930850 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b7ab18-051d-4512-a87f-962750e82da6" path="/var/lib/kubelet/pods/d9b7ab18-051d-4512-a87f-962750e82da6/volumes" Oct 03 10:03:09 crc kubenswrapper[4990]: I1003 10:03:09.411737 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerStarted","Data":"603eafaafed14cad283cbaa3059780f5a1da6d31cc82934c8ad3431436581baf"} Oct 03 10:03:09 crc kubenswrapper[4990]: I1003 10:03:09.446581 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.446559313 podStartE2EDuration="6.446559313s" podCreationTimestamp="2025-10-03 10:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:03:09.43688701 +0000 UTC m=+1171.233518887" watchObservedRunningTime="2025-10-03 10:03:09.446559313 +0000 UTC m=+1171.243191170" Oct 03 10:03:09 crc kubenswrapper[4990]: I1003 10:03:09.485349 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:03:09 crc kubenswrapper[4990]: I1003 10:03:09.882828 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:03:09 crc 
kubenswrapper[4990]: I1003 10:03:09.883302 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:03:10 crc kubenswrapper[4990]: I1003 10:03:10.423576 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerStarted","Data":"15879b52a5447c4f1318e71b94d644d6fa2c2d384c4a618bb98821e956651178"} Oct 03 10:03:10 crc kubenswrapper[4990]: I1003 10:03:10.423623 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerStarted","Data":"580bc6f927ad111e74705a2be9063fdd1335fba97dc16dee485218c69c8e7593"} Oct 03 10:03:10 crc kubenswrapper[4990]: I1003 10:03:10.430237 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerStarted","Data":"7e9f4eb4da8f4f56f5653f39b3a9a4d90d9400f1d0dfd515a6f59c965a1a5891"} Oct 03 10:03:11 crc kubenswrapper[4990]: I1003 10:03:11.446976 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerStarted","Data":"88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d"} Oct 03 10:03:11 crc kubenswrapper[4990]: I1003 10:03:11.452459 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerStarted","Data":"4b3af573f7585dcc48094b23b1a698decacc9f5185523abc523b6c1a3ab047ff"} Oct 03 10:03:11 crc kubenswrapper[4990]: I1003 10:03:11.473985 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.473962374 podStartE2EDuration="3.473962374s" podCreationTimestamp="2025-10-03 10:03:08 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:03:11.466679121 +0000 UTC m=+1173.263310998" watchObservedRunningTime="2025-10-03 10:03:11.473962374 +0000 UTC m=+1173.270594231" Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.464559 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerStarted","Data":"764711fd228e032dd915a7cdce28e2a55de7fdb51138c5ba3100b7652167a394"} Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.464932 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-central-agent" containerID="cri-o://502b9a05d649b3d317af1234a81ded4c49063df93ac40a282cdb421bb1ed6a5b" gracePeriod=30 Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.464994 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="proxy-httpd" containerID="cri-o://764711fd228e032dd915a7cdce28e2a55de7fdb51138c5ba3100b7652167a394" gracePeriod=30 Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.465042 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.465042 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-notification-agent" containerID="cri-o://7e9f4eb4da8f4f56f5653f39b3a9a4d90d9400f1d0dfd515a6f59c965a1a5891" gracePeriod=30 Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.465042 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="sg-core" containerID="cri-o://4b3af573f7585dcc48094b23b1a698decacc9f5185523abc523b6c1a3ab047ff" gracePeriod=30 Oct 03 10:03:12 crc kubenswrapper[4990]: I1003 10:03:12.495108 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9232014020000001 podStartE2EDuration="9.495091352s" podCreationTimestamp="2025-10-03 10:03:03 +0000 UTC" firstStartedPulling="2025-10-03 10:03:04.194628204 +0000 UTC m=+1165.991260061" lastFinishedPulling="2025-10-03 10:03:11.766518154 +0000 UTC m=+1173.563150011" observedRunningTime="2025-10-03 10:03:12.488974248 +0000 UTC m=+1174.285606105" watchObservedRunningTime="2025-10-03 10:03:12.495091352 +0000 UTC m=+1174.291723209" Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.475750 4990 generic.go:334] "Generic (PLEG): container finished" podID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerID="764711fd228e032dd915a7cdce28e2a55de7fdb51138c5ba3100b7652167a394" exitCode=0 Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.476062 4990 generic.go:334] "Generic (PLEG): container finished" podID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerID="4b3af573f7585dcc48094b23b1a698decacc9f5185523abc523b6c1a3ab047ff" exitCode=2 Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.476070 4990 generic.go:334] "Generic (PLEG): container finished" podID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerID="7e9f4eb4da8f4f56f5653f39b3a9a4d90d9400f1d0dfd515a6f59c965a1a5891" exitCode=0 Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.475833 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerDied","Data":"764711fd228e032dd915a7cdce28e2a55de7fdb51138c5ba3100b7652167a394"} Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.476124 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerDied","Data":"4b3af573f7585dcc48094b23b1a698decacc9f5185523abc523b6c1a3ab047ff"} Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.476144 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerDied","Data":"7e9f4eb4da8f4f56f5653f39b3a9a4d90d9400f1d0dfd515a6f59c965a1a5891"} Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.755842 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.783871 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.783917 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.821739 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:13 crc kubenswrapper[4990]: I1003 10:03:13.834977 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:14 crc kubenswrapper[4990]: I1003 10:03:14.074074 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 10:03:14 crc kubenswrapper[4990]: I1003 10:03:14.485380 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:14 crc kubenswrapper[4990]: I1003 10:03:14.485442 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:14 crc kubenswrapper[4990]: I1003 10:03:14.933906 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:03:15 crc kubenswrapper[4990]: I1003 10:03:15.002016 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:03:15 crc kubenswrapper[4990]: I1003 10:03:15.002570 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-669bd9dbd-82v7b" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-api" containerID="cri-o://9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654" gracePeriod=30 Oct 03 10:03:15 crc kubenswrapper[4990]: I1003 10:03:15.002719 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-669bd9dbd-82v7b" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-httpd" containerID="cri-o://fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6" gracePeriod=30 Oct 03 10:03:15 crc kubenswrapper[4990]: I1003 10:03:15.496212 4990 generic.go:334] "Generic (PLEG): container finished" podID="5adde993-f568-41c7-918c-4b03eb25e560" containerID="fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6" exitCode=0 Oct 03 10:03:15 crc kubenswrapper[4990]: I1003 10:03:15.496266 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerDied","Data":"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6"} Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.520343 4990 generic.go:334] "Generic (PLEG): container finished" podID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerID="502b9a05d649b3d317af1234a81ded4c49063df93ac40a282cdb421bb1ed6a5b" exitCode=0 Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.520401 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerDied","Data":"502b9a05d649b3d317af1234a81ded4c49063df93ac40a282cdb421bb1ed6a5b"} Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.680644 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.680785 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.723123 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.853195 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933658 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933701 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933721 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933788 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933837 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trzf5\" (UniqueName: \"kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933901 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.933952 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts\") pod \"0baa8292-5637-4d99-ae4c-7246010f1b84\" (UID: \"0baa8292-5637-4d99-ae4c-7246010f1b84\") " Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.934281 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.934416 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.934462 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.956342 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5" (OuterVolumeSpecName: "kube-api-access-trzf5") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "kube-api-access-trzf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.957673 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts" (OuterVolumeSpecName: "scripts") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:16 crc kubenswrapper[4990]: I1003 10:03:16.964246 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.024645 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.036128 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.036160 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0baa8292-5637-4d99-ae4c-7246010f1b84-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.036170 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trzf5\" (UniqueName: \"kubernetes.io/projected/0baa8292-5637-4d99-ae4c-7246010f1b84-kube-api-access-trzf5\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.036181 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.036190 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.056592 4990 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data" (OuterVolumeSpecName: "config-data") pod "0baa8292-5637-4d99-ae4c-7246010f1b84" (UID: "0baa8292-5637-4d99-ae4c-7246010f1b84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.137798 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baa8292-5637-4d99-ae4c-7246010f1b84-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.531224 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.531495 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0baa8292-5637-4d99-ae4c-7246010f1b84","Type":"ContainerDied","Data":"f51cf55ac0b76f4e518d7b35ad5d1ab7bbadb723261dbb26d8a67f493b2c1ea9"} Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.531719 4990 scope.go:117] "RemoveContainer" containerID="764711fd228e032dd915a7cdce28e2a55de7fdb51138c5ba3100b7652167a394" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.563858 4990 scope.go:117] "RemoveContainer" containerID="4b3af573f7585dcc48094b23b1a698decacc9f5185523abc523b6c1a3ab047ff" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.567842 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.585673 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.605866 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606259 4990 scope.go:117] "RemoveContainer" 
containerID="7e9f4eb4da8f4f56f5653f39b3a9a4d90d9400f1d0dfd515a6f59c965a1a5891" Oct 03 10:03:17 crc kubenswrapper[4990]: E1003 10:03:17.606317 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="proxy-httpd" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606335 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="proxy-httpd" Oct 03 10:03:17 crc kubenswrapper[4990]: E1003 10:03:17.606354 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="sg-core" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606362 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="sg-core" Oct 03 10:03:17 crc kubenswrapper[4990]: E1003 10:03:17.606390 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-notification-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606398 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-notification-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: E1003 10:03:17.606423 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-central-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606431 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-central-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606646 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="proxy-httpd" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606662 4990 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-notification-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606683 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="sg-core" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.606697 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" containerName="ceilometer-central-agent" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.608229 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.611469 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.611818 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.624088 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665670 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6l7\" (UniqueName: \"kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665793 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665831 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665867 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665914 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.665942 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.677067 4990 scope.go:117] "RemoveContainer" containerID="502b9a05d649b3d317af1234a81ded4c49063df93ac40a282cdb421bb1ed6a5b" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 
10:03:17.767363 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767416 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6l7\" (UniqueName: \"kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767544 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767585 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767629 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767698 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd\") pod 
\"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.767823 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.768379 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.769187 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.774069 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.774178 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.774637 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.781189 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.798901 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6l7\" (UniqueName: \"kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7\") pod \"ceilometer-0\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " pod="openstack/ceilometer-0" Oct 03 10:03:17 crc kubenswrapper[4990]: I1003 10:03:17.993338 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.478615 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:18 crc kubenswrapper[4990]: W1003 10:03:18.481006 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd35e33_5f62_49a8_bea2_3c3bd8ec01ec.slice/crio-b731dc3a3cf0bfda768c57a27171ad6a6e285c567851df525cf85aacbbcd2e97 WatchSource:0}: Error finding container b731dc3a3cf0bfda768c57a27171ad6a6e285c567851df525cf85aacbbcd2e97: Status 404 returned error can't find the container with id b731dc3a3cf0bfda768c57a27171ad6a6e285c567851df525cf85aacbbcd2e97 Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.540905 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerStarted","Data":"b731dc3a3cf0bfda768c57a27171ad6a6e285c567851df525cf85aacbbcd2e97"} Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.820741 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.821122 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.864098 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.888094 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0baa8292-5637-4d99-ae4c-7246010f1b84" path="/var/lib/kubelet/pods/0baa8292-5637-4d99-ae4c-7246010f1b84/volumes" Oct 03 10:03:18 crc kubenswrapper[4990]: I1003 10:03:18.897337 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.155119 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.316562 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config\") pod \"5adde993-f568-41c7-918c-4b03eb25e560\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.316841 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle\") pod \"5adde993-f568-41c7-918c-4b03eb25e560\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.317038 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs\") pod \"5adde993-f568-41c7-918c-4b03eb25e560\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.317191 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v868v\" (UniqueName: \"kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v\") pod \"5adde993-f568-41c7-918c-4b03eb25e560\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.317364 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config\") pod \"5adde993-f568-41c7-918c-4b03eb25e560\" (UID: \"5adde993-f568-41c7-918c-4b03eb25e560\") " Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.322613 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5adde993-f568-41c7-918c-4b03eb25e560" (UID: "5adde993-f568-41c7-918c-4b03eb25e560"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.323921 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v" (OuterVolumeSpecName: "kube-api-access-v868v") pod "5adde993-f568-41c7-918c-4b03eb25e560" (UID: "5adde993-f568-41c7-918c-4b03eb25e560"). InnerVolumeSpecName "kube-api-access-v868v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.369492 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config" (OuterVolumeSpecName: "config") pod "5adde993-f568-41c7-918c-4b03eb25e560" (UID: "5adde993-f568-41c7-918c-4b03eb25e560"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.391056 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5adde993-f568-41c7-918c-4b03eb25e560" (UID: "5adde993-f568-41c7-918c-4b03eb25e560"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.402213 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5adde993-f568-41c7-918c-4b03eb25e560" (UID: "5adde993-f568-41c7-918c-4b03eb25e560"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.419651 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.419701 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.419715 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.419735 4990 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5adde993-f568-41c7-918c-4b03eb25e560-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.419749 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v868v\" (UniqueName: \"kubernetes.io/projected/5adde993-f568-41c7-918c-4b03eb25e560-kube-api-access-v868v\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.556642 4990 generic.go:334] "Generic (PLEG): container finished" podID="5adde993-f568-41c7-918c-4b03eb25e560" containerID="9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654" exitCode=0 Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.556719 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-669bd9dbd-82v7b" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.556768 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerDied","Data":"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654"} Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.556807 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669bd9dbd-82v7b" event={"ID":"5adde993-f568-41c7-918c-4b03eb25e560","Type":"ContainerDied","Data":"ae55941da89fe54a2ad7d0e69456103989d0c9ef6e19f7d5a41847fcb4549e5c"} Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.556834 4990 scope.go:117] "RemoveContainer" containerID="fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.557488 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.557524 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.622436 4990 scope.go:117] "RemoveContainer" containerID="9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.649717 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.661337 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-669bd9dbd-82v7b"] Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.679429 4990 scope.go:117] "RemoveContainer" containerID="fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6" Oct 03 10:03:19 crc kubenswrapper[4990]: E1003 10:03:19.681972 4990 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6\": container with ID starting with fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6 not found: ID does not exist" containerID="fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.682022 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6"} err="failed to get container status \"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6\": rpc error: code = NotFound desc = could not find container \"fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6\": container with ID starting with fbb668c5302044e542e1cd8a244bd1b8b4d792d6f644bd2c7f753efabd5fe0f6 not found: ID does not exist" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.682055 4990 scope.go:117] "RemoveContainer" containerID="9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654" Oct 03 10:03:19 crc kubenswrapper[4990]: E1003 10:03:19.682396 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654\": container with ID starting with 9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654 not found: ID does not exist" containerID="9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.682428 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654"} err="failed to get container status \"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654\": rpc error: code = NotFound desc = could not find container 
\"9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654\": container with ID starting with 9a4f35042acf46989b9f338635e67fc5191d6e0e156bbea7dda7193f4d4be654 not found: ID does not exist" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.929559 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6958c"] Oct 03 10:03:19 crc kubenswrapper[4990]: E1003 10:03:19.930223 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-httpd" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.930240 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-httpd" Oct 03 10:03:19 crc kubenswrapper[4990]: E1003 10:03:19.930265 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-api" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.930272 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-api" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.930442 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-httpd" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.930461 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5adde993-f568-41c7-918c-4b03eb25e560" containerName="neutron-api" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.931064 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:19 crc kubenswrapper[4990]: I1003 10:03:19.943696 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6958c"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.028530 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lh9qf"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.029943 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.034715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tfsb\" (UniqueName: \"kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb\") pod \"nova-api-db-create-6958c\" (UID: \"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d\") " pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.049825 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lh9qf"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.136595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbnx\" (UniqueName: \"kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx\") pod \"nova-cell0-db-create-lh9qf\" (UID: \"21b57ed7-8a33-445c-92fd-f95e0386fb1b\") " pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.136712 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tfsb\" (UniqueName: \"kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb\") pod \"nova-api-db-create-6958c\" (UID: \"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d\") " pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:20 crc kubenswrapper[4990]: 
I1003 10:03:20.157049 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tfsb\" (UniqueName: \"kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb\") pod \"nova-api-db-create-6958c\" (UID: \"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d\") " pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.238425 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbnx\" (UniqueName: \"kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx\") pod \"nova-cell0-db-create-lh9qf\" (UID: \"21b57ed7-8a33-445c-92fd-f95e0386fb1b\") " pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.256030 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbnx\" (UniqueName: \"kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx\") pod \"nova-cell0-db-create-lh9qf\" (UID: \"21b57ed7-8a33-445c-92fd-f95e0386fb1b\") " pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.289027 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.308747 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bm57g"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.348349 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.375707 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.376174 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bm57g"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.450505 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rtdh\" (UniqueName: \"kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh\") pod \"nova-cell1-db-create-bm57g\" (UID: \"9b6639ec-52e5-498c-a2b3-f2d615b15c60\") " pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.552103 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rtdh\" (UniqueName: \"kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh\") pod \"nova-cell1-db-create-bm57g\" (UID: \"9b6639ec-52e5-498c-a2b3-f2d615b15c60\") " pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.571972 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerStarted","Data":"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b"} Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.583435 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rtdh\" (UniqueName: \"kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh\") pod \"nova-cell1-db-create-bm57g\" (UID: \"9b6639ec-52e5-498c-a2b3-f2d615b15c60\") " pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.708226 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:20 crc kubenswrapper[4990]: W1003 10:03:20.900055 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe39ff15_8ec8_49b9_b69d_4f17d64c6e9d.slice/crio-1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47 WatchSource:0}: Error finding container 1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47: Status 404 returned error can't find the container with id 1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47 Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.909365 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5adde993-f568-41c7-918c-4b03eb25e560" path="/var/lib/kubelet/pods/5adde993-f568-41c7-918c-4b03eb25e560/volumes" Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.909931 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6958c"] Oct 03 10:03:20 crc kubenswrapper[4990]: I1003 10:03:20.979252 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lh9qf"] Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.203403 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bm57g"] Oct 03 10:03:21 crc kubenswrapper[4990]: W1003 10:03:21.267445 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b6639ec_52e5_498c_a2b3_f2d615b15c60.slice/crio-6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a WatchSource:0}: Error finding container 6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a: Status 404 returned error can't find the container with id 6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.604600 4990 generic.go:334] "Generic (PLEG): container 
finished" podID="21b57ed7-8a33-445c-92fd-f95e0386fb1b" containerID="ae63f4c738228664295d529f1f2ce00d483a4a4e2629816aaf711c6d25cf258f" exitCode=0 Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.604703 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lh9qf" event={"ID":"21b57ed7-8a33-445c-92fd-f95e0386fb1b","Type":"ContainerDied","Data":"ae63f4c738228664295d529f1f2ce00d483a4a4e2629816aaf711c6d25cf258f"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.604984 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lh9qf" event={"ID":"21b57ed7-8a33-445c-92fd-f95e0386fb1b","Type":"ContainerStarted","Data":"5b4d768a52cee5395669d2696a43ef5620588258bb8beb6bff606cf0e3c6bf40"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.606586 4990 generic.go:334] "Generic (PLEG): container finished" podID="fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" containerID="164df144161995268ea85b29c9b4a158f835548115b6ddb54a76c9316445f8ad" exitCode=0 Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.606662 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6958c" event={"ID":"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d","Type":"ContainerDied","Data":"164df144161995268ea85b29c9b4a158f835548115b6ddb54a76c9316445f8ad"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.606849 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6958c" event={"ID":"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d","Type":"ContainerStarted","Data":"1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.608125 4990 generic.go:334] "Generic (PLEG): container finished" podID="9b6639ec-52e5-498c-a2b3-f2d615b15c60" containerID="770d83938a22e570b023d61ae7a6544022cebee19724127ed29717752e5252c0" exitCode=0 Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.608170 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm57g" event={"ID":"9b6639ec-52e5-498c-a2b3-f2d615b15c60","Type":"ContainerDied","Data":"770d83938a22e570b023d61ae7a6544022cebee19724127ed29717752e5252c0"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.608831 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm57g" event={"ID":"9b6639ec-52e5-498c-a2b3-f2d615b15c60","Type":"ContainerStarted","Data":"6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.609955 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerStarted","Data":"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434"} Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.610007 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.610138 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 10:03:21 crc kubenswrapper[4990]: I1003 10:03:21.900204 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 10:03:22 crc kubenswrapper[4990]: I1003 10:03:22.072023 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 10:03:22 crc kubenswrapper[4990]: I1003 10:03:22.621421 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerStarted","Data":"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3"} Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.063232 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.112522 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rtdh\" (UniqueName: \"kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh\") pod \"9b6639ec-52e5-498c-a2b3-f2d615b15c60\" (UID: \"9b6639ec-52e5-498c-a2b3-f2d615b15c60\") " Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.126918 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh" (OuterVolumeSpecName: "kube-api-access-5rtdh") pod "9b6639ec-52e5-498c-a2b3-f2d615b15c60" (UID: "9b6639ec-52e5-498c-a2b3-f2d615b15c60"). InnerVolumeSpecName "kube-api-access-5rtdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.200168 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.215030 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rtdh\" (UniqueName: \"kubernetes.io/projected/9b6639ec-52e5-498c-a2b3-f2d615b15c60-kube-api-access-5rtdh\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.231253 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.237476 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.320928 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stbnx\" (UniqueName: \"kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx\") pod \"21b57ed7-8a33-445c-92fd-f95e0386fb1b\" (UID: \"21b57ed7-8a33-445c-92fd-f95e0386fb1b\") " Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.321082 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tfsb\" (UniqueName: \"kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb\") pod \"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d\" (UID: \"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d\") " Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.341732 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb" (OuterVolumeSpecName: "kube-api-access-2tfsb") pod "fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" (UID: "fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d"). InnerVolumeSpecName "kube-api-access-2tfsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.348341 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx" (OuterVolumeSpecName: "kube-api-access-stbnx") pod "21b57ed7-8a33-445c-92fd-f95e0386fb1b" (UID: "21b57ed7-8a33-445c-92fd-f95e0386fb1b"). InnerVolumeSpecName "kube-api-access-stbnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.424205 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stbnx\" (UniqueName: \"kubernetes.io/projected/21b57ed7-8a33-445c-92fd-f95e0386fb1b-kube-api-access-stbnx\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.424255 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tfsb\" (UniqueName: \"kubernetes.io/projected/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d-kube-api-access-2tfsb\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.634157 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lh9qf" event={"ID":"21b57ed7-8a33-445c-92fd-f95e0386fb1b","Type":"ContainerDied","Data":"5b4d768a52cee5395669d2696a43ef5620588258bb8beb6bff606cf0e3c6bf40"} Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.634803 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4d768a52cee5395669d2696a43ef5620588258bb8beb6bff606cf0e3c6bf40" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.634905 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lh9qf" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.637649 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6958c" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.637642 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6958c" event={"ID":"fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d","Type":"ContainerDied","Data":"1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47"} Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.637717 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1177cb22e4e08de3780e78e0ae3d0a4ee420b809f945eadd6a3a59385a627a47" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.639254 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bm57g" event={"ID":"9b6639ec-52e5-498c-a2b3-f2d615b15c60","Type":"ContainerDied","Data":"6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a"} Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.639291 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b5b68863ee70acdd8596b8af93b3fc0d4511c4a0556f91e0e1d1194aab0d48a" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.639296 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bm57g" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642063 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-central-agent" containerID="cri-o://1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b" gracePeriod=30 Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642454 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerStarted","Data":"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635"} Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642547 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642630 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-notification-agent" containerID="cri-o://8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434" gracePeriod=30 Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642630 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="proxy-httpd" containerID="cri-o://86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635" gracePeriod=30 Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.642663 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="sg-core" containerID="cri-o://2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3" gracePeriod=30 Oct 03 10:03:23 crc kubenswrapper[4990]: I1003 10:03:23.677379 4990 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8622535550000001 podStartE2EDuration="6.677356765s" podCreationTimestamp="2025-10-03 10:03:17 +0000 UTC" firstStartedPulling="2025-10-03 10:03:18.484278497 +0000 UTC m=+1180.280910354" lastFinishedPulling="2025-10-03 10:03:23.299381707 +0000 UTC m=+1185.096013564" observedRunningTime="2025-10-03 10:03:23.675690444 +0000 UTC m=+1185.472322311" watchObservedRunningTime="2025-10-03 10:03:23.677356765 +0000 UTC m=+1185.473988622" Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653182 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerID="86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635" exitCode=0 Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653504 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerID="2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3" exitCode=2 Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653530 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerID="8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434" exitCode=0 Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653260 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerDied","Data":"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635"} Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653570 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerDied","Data":"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3"} Oct 03 10:03:24 crc kubenswrapper[4990]: I1003 10:03:24.653586 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerDied","Data":"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434"} Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.676113 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.704533 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerID="1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b" exitCode=0 Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.704575 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerDied","Data":"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b"} Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.704600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec","Type":"ContainerDied","Data":"b731dc3a3cf0bfda768c57a27171ad6a6e285c567851df525cf85aacbbcd2e97"} Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.704616 4990 scope.go:117] "RemoveContainer" containerID="86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.704746 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.731356 4990 scope.go:117] "RemoveContainer" containerID="2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.732714 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.732762 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.732855 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.732901 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.732962 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6l7\" (UniqueName: \"kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 
10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.733036 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.733083 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml\") pod \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\" (UID: \"9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec\") " Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.734068 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.734257 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.744355 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7" (OuterVolumeSpecName: "kube-api-access-ks6l7") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "kube-api-access-ks6l7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.744375 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts" (OuterVolumeSpecName: "scripts") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.775791 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.790681 4990 scope.go:117] "RemoveContainer" containerID="8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.832169 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835156 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835190 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835201 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835209 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks6l7\" (UniqueName: \"kubernetes.io/projected/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-kube-api-access-ks6l7\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835220 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.835228 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.864450 4990 scope.go:117] "RemoveContainer" containerID="1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.865948 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data" (OuterVolumeSpecName: "config-data") pod "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" (UID: "9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.889061 4990 scope.go:117] "RemoveContainer" containerID="86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635" Oct 03 10:03:28 crc kubenswrapper[4990]: E1003 10:03:28.889440 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635\": container with ID starting with 86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635 not found: ID does not exist" containerID="86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.889475 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635"} err="failed to get container status \"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635\": rpc error: code = NotFound desc = could not find container \"86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635\": container with ID starting with 86e2c03d14a0ce56d8a38ac53ec9e7f00c65be94b328658af52b8fd1794c7635 not found: ID does not exist" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.889503 4990 scope.go:117] "RemoveContainer" containerID="2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3" Oct 03 10:03:28 crc kubenswrapper[4990]: E1003 10:03:28.890710 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3\": container with ID starting with 
2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3 not found: ID does not exist" containerID="2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.890753 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3"} err="failed to get container status \"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3\": rpc error: code = NotFound desc = could not find container \"2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3\": container with ID starting with 2755a636f7d5e8e36c341bb5115d84d22dcd4610cebf5503161c3093317fadf3 not found: ID does not exist" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.890778 4990 scope.go:117] "RemoveContainer" containerID="8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434" Oct 03 10:03:28 crc kubenswrapper[4990]: E1003 10:03:28.891012 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434\": container with ID starting with 8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434 not found: ID does not exist" containerID="8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.891034 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434"} err="failed to get container status \"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434\": rpc error: code = NotFound desc = could not find container \"8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434\": container with ID starting with 8ce67169dca5e60519ddc5c5ceb9536862a8b0d0d45749f562716dfff315a434 not found: ID does not 
exist" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.891046 4990 scope.go:117] "RemoveContainer" containerID="1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b" Oct 03 10:03:28 crc kubenswrapper[4990]: E1003 10:03:28.891265 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b\": container with ID starting with 1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b not found: ID does not exist" containerID="1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.891296 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b"} err="failed to get container status \"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b\": rpc error: code = NotFound desc = could not find container \"1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b\": container with ID starting with 1e02fc576eb16b697636d711d6058b3127542f95887597877fd1d6a539bf326b not found: ID does not exist" Oct 03 10:03:28 crc kubenswrapper[4990]: I1003 10:03:28.937048 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.041828 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.053185 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.066703 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 
10:03:29.067185 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="proxy-httpd" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067202 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="proxy-httpd" Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 10:03:29.067229 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067239 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 10:03:29.067251 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-central-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067260 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-central-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 10:03:29.067275 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6639ec-52e5-498c-a2b3-f2d615b15c60" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067283 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6639ec-52e5-498c-a2b3-f2d615b15c60" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 10:03:29.067304 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="sg-core" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067312 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="sg-core" Oct 03 10:03:29 crc 
kubenswrapper[4990]: E1003 10:03:29.067329 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b57ed7-8a33-445c-92fd-f95e0386fb1b" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067336 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b57ed7-8a33-445c-92fd-f95e0386fb1b" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: E1003 10:03:29.067345 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-notification-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067353 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-notification-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067615 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b57ed7-8a33-445c-92fd-f95e0386fb1b" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067634 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="proxy-httpd" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067647 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-notification-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067662 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6639ec-52e5-498c-a2b3-f2d615b15c60" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067681 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" containerName="mariadb-database-create" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067695 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="sg-core" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.067704 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" containerName="ceilometer-central-agent" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.069723 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.074282 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.074483 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.104167 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141041 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141099 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46hf\" (UniqueName: \"kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141135 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141286 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141369 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141542 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.141630 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46hf\" (UniqueName: \"kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243228 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243267 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243309 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243367 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243404 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243559 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data\") pod \"ceilometer-0\" (UID: 
\"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.243930 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.244052 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.247467 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.248072 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.248911 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.255052 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.261474 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46hf\" (UniqueName: \"kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf\") pod \"ceilometer-0\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.417387 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:29 crc kubenswrapper[4990]: I1003 10:03:29.901632 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:29 crc kubenswrapper[4990]: W1003 10:03:29.910360 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c6fe72_342f_46d2_9aa1_772a41255500.slice/crio-57316d90ff97d479ae28c5eef8d19cf9998ff38adc9701e74138dd93e64577c8 WatchSource:0}: Error finding container 57316d90ff97d479ae28c5eef8d19cf9998ff38adc9701e74138dd93e64577c8: Status 404 returned error can't find the container with id 57316d90ff97d479ae28c5eef8d19cf9998ff38adc9701e74138dd93e64577c8 Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.174434 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c48a-account-create-pc4h8"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.175687 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.177258 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.185762 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c48a-account-create-pc4h8"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.264558 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw\") pod \"nova-api-c48a-account-create-pc4h8\" (UID: \"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef\") " pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.296537 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5001-account-create-zstjs"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.298129 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.304861 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5001-account-create-zstjs"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.307981 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.365947 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgfw\" (UniqueName: \"kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw\") pod \"nova-cell0-5001-account-create-zstjs\" (UID: \"50b4be29-bab9-4e12-ada2-152f89f790bc\") " pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.366101 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw\") pod \"nova-api-c48a-account-create-pc4h8\" (UID: \"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef\") " pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.390889 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw\") pod \"nova-api-c48a-account-create-pc4h8\" (UID: \"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef\") " pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.463942 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fc5d-account-create-kwx27"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.465086 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.470076 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.470072 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgfw\" (UniqueName: \"kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw\") pod \"nova-cell0-5001-account-create-zstjs\" (UID: \"50b4be29-bab9-4e12-ada2-152f89f790bc\") " pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.474934 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fc5d-account-create-kwx27"] Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.490270 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgfw\" (UniqueName: \"kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw\") pod \"nova-cell0-5001-account-create-zstjs\" (UID: \"50b4be29-bab9-4e12-ada2-152f89f790bc\") " pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.501913 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.572740 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stsfw\" (UniqueName: \"kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw\") pod \"nova-cell1-fc5d-account-create-kwx27\" (UID: \"4ae7b8b5-463c-4844-9b60-1e87c5681693\") " pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.632529 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.678026 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stsfw\" (UniqueName: \"kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw\") pod \"nova-cell1-fc5d-account-create-kwx27\" (UID: \"4ae7b8b5-463c-4844-9b60-1e87c5681693\") " pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.703241 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stsfw\" (UniqueName: \"kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw\") pod \"nova-cell1-fc5d-account-create-kwx27\" (UID: \"4ae7b8b5-463c-4844-9b60-1e87c5681693\") " pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.741637 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerStarted","Data":"57316d90ff97d479ae28c5eef8d19cf9998ff38adc9701e74138dd93e64577c8"} Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.885219 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec" path="/var/lib/kubelet/pods/9fd35e33-5f62-49a8-bea2-3c3bd8ec01ec/volumes" Oct 03 10:03:30 crc kubenswrapper[4990]: I1003 10:03:30.938422 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.029350 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c48a-account-create-pc4h8"] Oct 03 10:03:31 crc kubenswrapper[4990]: W1003 10:03:31.037405 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbc84de3_47d9_42cd_9f9d_f4bd014d57ef.slice/crio-68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f WatchSource:0}: Error finding container 68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f: Status 404 returned error can't find the container with id 68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.084602 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.176949 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5001-account-create-zstjs"] Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.460110 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fc5d-account-create-kwx27"] Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.758026 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerStarted","Data":"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.758068 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerStarted","Data":"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.759801 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" containerID="4128adaddf77e7ec00c9f9e607647913aef983b6901b7e19ceeba0f2e8ed8bbd" exitCode=0 Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.759883 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c48a-account-create-pc4h8" event={"ID":"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef","Type":"ContainerDied","Data":"4128adaddf77e7ec00c9f9e607647913aef983b6901b7e19ceeba0f2e8ed8bbd"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.759909 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c48a-account-create-pc4h8" event={"ID":"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef","Type":"ContainerStarted","Data":"68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.761535 4990 generic.go:334] "Generic (PLEG): container finished" podID="50b4be29-bab9-4e12-ada2-152f89f790bc" containerID="adffc8be595f8d01d2798d1490842d5e7a8588b528aebf87a6717bb8e399160b" exitCode=0 Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.761685 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5001-account-create-zstjs" event={"ID":"50b4be29-bab9-4e12-ada2-152f89f790bc","Type":"ContainerDied","Data":"adffc8be595f8d01d2798d1490842d5e7a8588b528aebf87a6717bb8e399160b"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.761715 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5001-account-create-zstjs" event={"ID":"50b4be29-bab9-4e12-ada2-152f89f790bc","Type":"ContainerStarted","Data":"73aa2c96539c9fb912eabfb8147c3728709350cd986a69004dcd240e54c7914e"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.763219 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5d-account-create-kwx27" event={"ID":"4ae7b8b5-463c-4844-9b60-1e87c5681693","Type":"ContainerStarted","Data":"27ee6574dd58ed3cc5609e5e3cf295930d7487f08737578441411bd7326a416d"} Oct 
03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.763243 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5d-account-create-kwx27" event={"ID":"4ae7b8b5-463c-4844-9b60-1e87c5681693","Type":"ContainerStarted","Data":"5833feecc7551569804db7b0bf751039313b3632b31364168889199a569c7d6e"} Oct 03 10:03:31 crc kubenswrapper[4990]: I1003 10:03:31.787115 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-fc5d-account-create-kwx27" podStartSLOduration=1.787095665 podStartE2EDuration="1.787095665s" podCreationTimestamp="2025-10-03 10:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:03:31.786844859 +0000 UTC m=+1193.583476716" watchObservedRunningTime="2025-10-03 10:03:31.787095665 +0000 UTC m=+1193.583727542" Oct 03 10:03:32 crc kubenswrapper[4990]: I1003 10:03:32.775070 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerStarted","Data":"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d"} Oct 03 10:03:32 crc kubenswrapper[4990]: I1003 10:03:32.776590 4990 generic.go:334] "Generic (PLEG): container finished" podID="4ae7b8b5-463c-4844-9b60-1e87c5681693" containerID="27ee6574dd58ed3cc5609e5e3cf295930d7487f08737578441411bd7326a416d" exitCode=0 Oct 03 10:03:32 crc kubenswrapper[4990]: I1003 10:03:32.776693 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5d-account-create-kwx27" event={"ID":"4ae7b8b5-463c-4844-9b60-1e87c5681693","Type":"ContainerDied","Data":"27ee6574dd58ed3cc5609e5e3cf295930d7487f08737578441411bd7326a416d"} Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.242398 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.252674 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.339408 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csgfw\" (UniqueName: \"kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw\") pod \"50b4be29-bab9-4e12-ada2-152f89f790bc\" (UID: \"50b4be29-bab9-4e12-ada2-152f89f790bc\") " Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.339486 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw\") pod \"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef\" (UID: \"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef\") " Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.349052 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw" (OuterVolumeSpecName: "kube-api-access-gqgrw") pod "bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" (UID: "bbc84de3-47d9-42cd-9f9d-f4bd014d57ef"). InnerVolumeSpecName "kube-api-access-gqgrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.354783 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw" (OuterVolumeSpecName: "kube-api-access-csgfw") pod "50b4be29-bab9-4e12-ada2-152f89f790bc" (UID: "50b4be29-bab9-4e12-ada2-152f89f790bc"). InnerVolumeSpecName "kube-api-access-csgfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.442203 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csgfw\" (UniqueName: \"kubernetes.io/projected/50b4be29-bab9-4e12-ada2-152f89f790bc-kube-api-access-csgfw\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.442262 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqgrw\" (UniqueName: \"kubernetes.io/projected/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef-kube-api-access-gqgrw\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.787349 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c48a-account-create-pc4h8" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.787340 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c48a-account-create-pc4h8" event={"ID":"bbc84de3-47d9-42cd-9f9d-f4bd014d57ef","Type":"ContainerDied","Data":"68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f"} Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.787547 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68ddd2d94ba0b71e8ef89a16b1a376b3aa48bbe25c13db082bd999737674334f" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.789263 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5001-account-create-zstjs" event={"ID":"50b4be29-bab9-4e12-ada2-152f89f790bc","Type":"ContainerDied","Data":"73aa2c96539c9fb912eabfb8147c3728709350cd986a69004dcd240e54c7914e"} Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.789303 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73aa2c96539c9fb912eabfb8147c3728709350cd986a69004dcd240e54c7914e" Oct 03 10:03:33 crc kubenswrapper[4990]: I1003 10:03:33.789288 4990 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell0-5001-account-create-zstjs" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.211884 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.256235 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stsfw\" (UniqueName: \"kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw\") pod \"4ae7b8b5-463c-4844-9b60-1e87c5681693\" (UID: \"4ae7b8b5-463c-4844-9b60-1e87c5681693\") " Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.260766 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw" (OuterVolumeSpecName: "kube-api-access-stsfw") pod "4ae7b8b5-463c-4844-9b60-1e87c5681693" (UID: "4ae7b8b5-463c-4844-9b60-1e87c5681693"). InnerVolumeSpecName "kube-api-access-stsfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.358809 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stsfw\" (UniqueName: \"kubernetes.io/projected/4ae7b8b5-463c-4844-9b60-1e87c5681693-kube-api-access-stsfw\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828127 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerStarted","Data":"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564"} Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828283 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="sg-core" containerID="cri-o://40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d" gracePeriod=30 Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828292 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="proxy-httpd" containerID="cri-o://9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564" gracePeriod=30 Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828288 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-central-agent" containerID="cri-o://d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43" gracePeriod=30 Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828373 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-notification-agent" 
containerID="cri-o://a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06" gracePeriod=30 Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.828492 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.831662 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fc5d-account-create-kwx27" event={"ID":"4ae7b8b5-463c-4844-9b60-1e87c5681693","Type":"ContainerDied","Data":"5833feecc7551569804db7b0bf751039313b3632b31364168889199a569c7d6e"} Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.831707 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5833feecc7551569804db7b0bf751039313b3632b31364168889199a569c7d6e" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.831775 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fc5d-account-create-kwx27" Oct 03 10:03:34 crc kubenswrapper[4990]: I1003 10:03:34.854379 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7185792640000002 podStartE2EDuration="5.854357185s" podCreationTimestamp="2025-10-03 10:03:29 +0000 UTC" firstStartedPulling="2025-10-03 10:03:29.914817196 +0000 UTC m=+1191.711449053" lastFinishedPulling="2025-10-03 10:03:34.050595117 +0000 UTC m=+1195.847226974" observedRunningTime="2025-10-03 10:03:34.849649977 +0000 UTC m=+1196.646281864" watchObservedRunningTime="2025-10-03 10:03:34.854357185 +0000 UTC m=+1196.650989042" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.474931 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p67ht"] Oct 03 10:03:35 crc kubenswrapper[4990]: E1003 10:03:35.475771 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" containerName="mariadb-account-create" Oct 
03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.475807 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: E1003 10:03:35.475836 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae7b8b5-463c-4844-9b60-1e87c5681693" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.475847 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae7b8b5-463c-4844-9b60-1e87c5681693" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: E1003 10:03:35.475863 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b4be29-bab9-4e12-ada2-152f89f790bc" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.475873 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b4be29-bab9-4e12-ada2-152f89f790bc" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.476142 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.476167 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b4be29-bab9-4e12-ada2-152f89f790bc" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.476186 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae7b8b5-463c-4844-9b60-1e87c5681693" containerName="mariadb-account-create" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.476908 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.478533 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.481533 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.481875 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mp8qd" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.496380 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p67ht"] Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.584634 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.584684 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5zf\" (UniqueName: \"kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.584753 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " 
pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.584882 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.687696 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.687759 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5zf\" (UniqueName: \"kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.687831 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.687939 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: 
\"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.695325 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.699453 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.704245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.744292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5zf\" (UniqueName: \"kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf\") pod \"nova-cell0-conductor-db-sync-p67ht\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.796184 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845258 4990 generic.go:334] "Generic (PLEG): container finished" podID="71c6fe72-342f-46d2-9aa1-772a41255500" containerID="9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564" exitCode=0 Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845297 4990 generic.go:334] "Generic (PLEG): container finished" podID="71c6fe72-342f-46d2-9aa1-772a41255500" containerID="40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d" exitCode=2 Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845312 4990 generic.go:334] "Generic (PLEG): container finished" podID="71c6fe72-342f-46d2-9aa1-772a41255500" containerID="a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06" exitCode=0 Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845323 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerDied","Data":"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564"} Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845364 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerDied","Data":"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d"} Oct 03 10:03:35 crc kubenswrapper[4990]: I1003 10:03:35.845379 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerDied","Data":"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06"} Oct 03 10:03:36 crc kubenswrapper[4990]: I1003 10:03:36.239975 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p67ht"] Oct 03 10:03:36 crc kubenswrapper[4990]: W1003 10:03:36.250334 4990 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf25005_f050_4d64_bbeb_19faa36c9d43.slice/crio-2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f WatchSource:0}: Error finding container 2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f: Status 404 returned error can't find the container with id 2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f Oct 03 10:03:36 crc kubenswrapper[4990]: I1003 10:03:36.854639 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p67ht" event={"ID":"bbf25005-f050-4d64-bbeb-19faa36c9d43","Type":"ContainerStarted","Data":"2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f"} Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.493411 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544222 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544273 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544321 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc 
kubenswrapper[4990]: I1003 10:03:37.544355 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544467 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46hf\" (UniqueName: \"kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544493 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544635 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle\") pod \"71c6fe72-342f-46d2-9aa1-772a41255500\" (UID: \"71c6fe72-342f-46d2-9aa1-772a41255500\") " Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.544975 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.545417 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.545399 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.550885 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts" (OuterVolumeSpecName: "scripts") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.551477 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf" (OuterVolumeSpecName: "kube-api-access-z46hf") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "kube-api-access-z46hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.572154 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.640694 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.647067 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.647288 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.647394 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.647484 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46hf\" (UniqueName: \"kubernetes.io/projected/71c6fe72-342f-46d2-9aa1-772a41255500-kube-api-access-z46hf\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.647614 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71c6fe72-342f-46d2-9aa1-772a41255500-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.657697 4990 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data" (OuterVolumeSpecName: "config-data") pod "71c6fe72-342f-46d2-9aa1-772a41255500" (UID: "71c6fe72-342f-46d2-9aa1-772a41255500"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.749189 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c6fe72-342f-46d2-9aa1-772a41255500-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.869970 4990 generic.go:334] "Generic (PLEG): container finished" podID="71c6fe72-342f-46d2-9aa1-772a41255500" containerID="d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43" exitCode=0 Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.870020 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerDied","Data":"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43"} Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.870051 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71c6fe72-342f-46d2-9aa1-772a41255500","Type":"ContainerDied","Data":"57316d90ff97d479ae28c5eef8d19cf9998ff38adc9701e74138dd93e64577c8"} Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.870069 4990 scope.go:117] "RemoveContainer" containerID="9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.870236 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.899728 4990 scope.go:117] "RemoveContainer" containerID="40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.927214 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.943075 4990 scope.go:117] "RemoveContainer" containerID="a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.948768 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.961275 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:37 crc kubenswrapper[4990]: E1003 10:03:37.961841 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="proxy-httpd" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.961864 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="proxy-httpd" Oct 03 10:03:37 crc kubenswrapper[4990]: E1003 10:03:37.961901 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="sg-core" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.961910 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="sg-core" Oct 03 10:03:37 crc kubenswrapper[4990]: E1003 10:03:37.961927 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-central-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.961935 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" 
containerName="ceilometer-central-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: E1003 10:03:37.961956 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-notification-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.961964 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-notification-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.962184 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-central-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.962206 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="ceilometer-notification-agent" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.962217 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="proxy-httpd" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.962232 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" containerName="sg-core" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.964758 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.968876 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.969301 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:03:37 crc kubenswrapper[4990]: I1003 10:03:37.983233 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.000306 4990 scope.go:117] "RemoveContainer" containerID="d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.031636 4990 scope.go:117] "RemoveContainer" containerID="9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564" Oct 03 10:03:38 crc kubenswrapper[4990]: E1003 10:03:38.032150 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564\": container with ID starting with 9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564 not found: ID does not exist" containerID="9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.032191 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564"} err="failed to get container status \"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564\": rpc error: code = NotFound desc = could not find container \"9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564\": container with ID starting with 9bdc6aae664221552b83e288546a8bb2901d904778de609eeca338ba7b7c5564 not found: ID does not exist" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 
10:03:38.032219 4990 scope.go:117] "RemoveContainer" containerID="40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d" Oct 03 10:03:38 crc kubenswrapper[4990]: E1003 10:03:38.032698 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d\": container with ID starting with 40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d not found: ID does not exist" containerID="40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.032724 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d"} err="failed to get container status \"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d\": rpc error: code = NotFound desc = could not find container \"40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d\": container with ID starting with 40e4d12e8d7985b2f21458d4272e3524d622588e96143720c3111cea4a85c24d not found: ID does not exist" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.032739 4990 scope.go:117] "RemoveContainer" containerID="a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06" Oct 03 10:03:38 crc kubenswrapper[4990]: E1003 10:03:38.032963 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06\": container with ID starting with a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06 not found: ID does not exist" containerID="a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.032989 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06"} err="failed to get container status \"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06\": rpc error: code = NotFound desc = could not find container \"a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06\": container with ID starting with a3d5e208eb4ad52ee956e7c1a2167934fa5ec310ef88c2eacbac01d9f89eac06 not found: ID does not exist" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.033011 4990 scope.go:117] "RemoveContainer" containerID="d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43" Oct 03 10:03:38 crc kubenswrapper[4990]: E1003 10:03:38.033445 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43\": container with ID starting with d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43 not found: ID does not exist" containerID="d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.033485 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43"} err="failed to get container status \"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43\": rpc error: code = NotFound desc = could not find container \"d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43\": container with ID starting with d2ed5adeb0ced05f2e76b60ee1d2e8c6265714b15f7a87dea3a8d4f923542b43 not found: ID does not exist" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.055821 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj4h\" (UniqueName: \"kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h\") pod 
\"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.055924 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.056053 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.056106 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.056431 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.056468 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.056717 
4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.158684 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.158756 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj4h\" (UniqueName: \"kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.158866 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.160117 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.160183 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.160222 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.160246 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.161406 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.161519 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.167063 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.167178 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.167513 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.172970 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.181330 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj4h\" (UniqueName: \"kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h\") pod \"ceilometer-0\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.290561 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.784052 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.908016 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c6fe72-342f-46d2-9aa1-772a41255500" path="/var/lib/kubelet/pods/71c6fe72-342f-46d2-9aa1-772a41255500/volumes" Oct 03 10:03:38 crc kubenswrapper[4990]: I1003 10:03:38.908987 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerStarted","Data":"4efde896e24e291c52f408ff4563481723c2303a3b819a5c9c9538eb499b61f5"} Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.178630 4990 scope.go:117] "RemoveContainer" containerID="d61fca09551d7f6d480e627523de4e901adcedfd108550b223b3ee76a03140b5" Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.212196 4990 scope.go:117] "RemoveContainer" containerID="89e034e10465e3472936c27fbb39634d6db5ea147ab386aad010adb428272bdf" Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.348636 4990 scope.go:117] "RemoveContainer" containerID="07e2e19978fe2628ac8a1e7ad64932fbeca11159cb5c38a9ac913a5716a285da" Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.967814 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerStarted","Data":"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0"} Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.971182 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p67ht" event={"ID":"bbf25005-f050-4d64-bbeb-19faa36c9d43","Type":"ContainerStarted","Data":"73e5d97a45dc912b3b85601291d4c5afec6117cd29fbe2b1bc76a7237b673460"} Oct 03 10:03:43 crc kubenswrapper[4990]: I1003 10:03:43.995657 4990 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-p67ht" podStartSLOduration=2.03528053 podStartE2EDuration="8.995638466s" podCreationTimestamp="2025-10-03 10:03:35 +0000 UTC" firstStartedPulling="2025-10-03 10:03:36.252915398 +0000 UTC m=+1198.049547255" lastFinishedPulling="2025-10-03 10:03:43.213273324 +0000 UTC m=+1205.009905191" observedRunningTime="2025-10-03 10:03:43.988762923 +0000 UTC m=+1205.785394790" watchObservedRunningTime="2025-10-03 10:03:43.995638466 +0000 UTC m=+1205.792270323" Oct 03 10:03:44 crc kubenswrapper[4990]: I1003 10:03:44.983917 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerStarted","Data":"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b"} Oct 03 10:03:50 crc kubenswrapper[4990]: I1003 10:03:50.048150 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerStarted","Data":"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8"} Oct 03 10:03:51 crc kubenswrapper[4990]: I1003 10:03:51.060684 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerStarted","Data":"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e"} Oct 03 10:03:51 crc kubenswrapper[4990]: I1003 10:03:51.061014 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:03:51 crc kubenswrapper[4990]: I1003 10:03:51.089017 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.226986034 podStartE2EDuration="14.088992567s" podCreationTimestamp="2025-10-03 10:03:37 +0000 UTC" firstStartedPulling="2025-10-03 10:03:38.788091043 +0000 UTC m=+1200.584722900" 
lastFinishedPulling="2025-10-03 10:03:50.650097576 +0000 UTC m=+1212.446729433" observedRunningTime="2025-10-03 10:03:51.084769601 +0000 UTC m=+1212.881401458" watchObservedRunningTime="2025-10-03 10:03:51.088992567 +0000 UTC m=+1212.885624464" Oct 03 10:04:00 crc kubenswrapper[4990]: I1003 10:04:00.152505 4990 generic.go:334] "Generic (PLEG): container finished" podID="bbf25005-f050-4d64-bbeb-19faa36c9d43" containerID="73e5d97a45dc912b3b85601291d4c5afec6117cd29fbe2b1bc76a7237b673460" exitCode=0 Oct 03 10:04:00 crc kubenswrapper[4990]: I1003 10:04:00.152617 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p67ht" event={"ID":"bbf25005-f050-4d64-bbeb-19faa36c9d43","Type":"ContainerDied","Data":"73e5d97a45dc912b3b85601291d4c5afec6117cd29fbe2b1bc76a7237b673460"} Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.523157 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.629307 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5zf\" (UniqueName: \"kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf\") pod \"bbf25005-f050-4d64-bbeb-19faa36c9d43\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.629395 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts\") pod \"bbf25005-f050-4d64-bbeb-19faa36c9d43\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.629485 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle\") pod 
\"bbf25005-f050-4d64-bbeb-19faa36c9d43\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.629562 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data\") pod \"bbf25005-f050-4d64-bbeb-19faa36c9d43\" (UID: \"bbf25005-f050-4d64-bbeb-19faa36c9d43\") " Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.634723 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts" (OuterVolumeSpecName: "scripts") pod "bbf25005-f050-4d64-bbeb-19faa36c9d43" (UID: "bbf25005-f050-4d64-bbeb-19faa36c9d43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.637416 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf" (OuterVolumeSpecName: "kube-api-access-lc5zf") pod "bbf25005-f050-4d64-bbeb-19faa36c9d43" (UID: "bbf25005-f050-4d64-bbeb-19faa36c9d43"). InnerVolumeSpecName "kube-api-access-lc5zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.661068 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data" (OuterVolumeSpecName: "config-data") pod "bbf25005-f050-4d64-bbeb-19faa36c9d43" (UID: "bbf25005-f050-4d64-bbeb-19faa36c9d43"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.666739 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf25005-f050-4d64-bbeb-19faa36c9d43" (UID: "bbf25005-f050-4d64-bbeb-19faa36c9d43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.731682 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5zf\" (UniqueName: \"kubernetes.io/projected/bbf25005-f050-4d64-bbeb-19faa36c9d43-kube-api-access-lc5zf\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.731720 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.731736 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:01 crc kubenswrapper[4990]: I1003 10:04:01.731747 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf25005-f050-4d64-bbeb-19faa36c9d43-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.174687 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p67ht" event={"ID":"bbf25005-f050-4d64-bbeb-19faa36c9d43","Type":"ContainerDied","Data":"2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f"} Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.174966 4990 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="2fa089e6bd20fec9432dc9ab555bfe077cdffbbc1dfbc30086d10d6766fb0e6f" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.174737 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p67ht" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.264282 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:04:02 crc kubenswrapper[4990]: E1003 10:04:02.264820 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf25005-f050-4d64-bbeb-19faa36c9d43" containerName="nova-cell0-conductor-db-sync" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.264848 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf25005-f050-4d64-bbeb-19faa36c9d43" containerName="nova-cell0-conductor-db-sync" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.265101 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf25005-f050-4d64-bbeb-19faa36c9d43" containerName="nova-cell0-conductor-db-sync" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.265813 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.268916 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.269352 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mp8qd" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.275104 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.341767 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.341925 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfkx\" (UniqueName: \"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.341959 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.443670 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfkx\" (UniqueName: 
\"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.443722 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.443787 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.448492 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.457898 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.463708 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfkx\" (UniqueName: \"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx\") pod \"nova-cell0-conductor-0\" (UID: 
\"85167add-e116-4e56-950b-fe0a6a553732\") " pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:02 crc kubenswrapper[4990]: I1003 10:04:02.584219 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:03 crc kubenswrapper[4990]: I1003 10:04:03.015177 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:04:03 crc kubenswrapper[4990]: I1003 10:04:03.184156 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85167add-e116-4e56-950b-fe0a6a553732","Type":"ContainerStarted","Data":"b0dc24860cca9dcf1728b502d556e1fd044b3a4223fd1b3ee4e4491f0a3bd6fd"} Oct 03 10:04:04 crc kubenswrapper[4990]: I1003 10:04:04.196266 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85167add-e116-4e56-950b-fe0a6a553732","Type":"ContainerStarted","Data":"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e"} Oct 03 10:04:04 crc kubenswrapper[4990]: I1003 10:04:04.196635 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:04 crc kubenswrapper[4990]: I1003 10:04:04.219298 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.219281065 podStartE2EDuration="2.219281065s" podCreationTimestamp="2025-10-03 10:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:04.211362486 +0000 UTC m=+1226.007994343" watchObservedRunningTime="2025-10-03 10:04:04.219281065 +0000 UTC m=+1226.015912922" Oct 03 10:04:08 crc kubenswrapper[4990]: I1003 10:04:08.300573 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 10:04:12 crc kubenswrapper[4990]: I1003 
10:04:12.613873 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 10:04:12 crc kubenswrapper[4990]: I1003 10:04:12.654260 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:12 crc kubenswrapper[4990]: I1003 10:04:12.654473 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" containerName="kube-state-metrics" containerID="cri-o://0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108" gracePeriod=30 Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.158219 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.233305 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xq2vm"] Oct 03 10:04:13 crc kubenswrapper[4990]: E1003 10:04:13.233831 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" containerName="kube-state-metrics" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.233853 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" containerName="kube-state-metrics" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.234264 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" containerName="kube-state-metrics" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.235027 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.239310 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.239520 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.242094 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msrkd\" (UniqueName: \"kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd\") pod \"78a0dde5-4d0a-49a7-b9b1-081a994a41da\" (UID: \"78a0dde5-4d0a-49a7-b9b1-081a994a41da\") " Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.253263 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xq2vm"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.260872 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd" (OuterVolumeSpecName: "kube-api-access-msrkd") pod "78a0dde5-4d0a-49a7-b9b1-081a994a41da" (UID: "78a0dde5-4d0a-49a7-b9b1-081a994a41da"). InnerVolumeSpecName "kube-api-access-msrkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.280572 4990 generic.go:334] "Generic (PLEG): container finished" podID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" containerID="0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108" exitCode=2 Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.280611 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"78a0dde5-4d0a-49a7-b9b1-081a994a41da","Type":"ContainerDied","Data":"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108"} Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.280637 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"78a0dde5-4d0a-49a7-b9b1-081a994a41da","Type":"ContainerDied","Data":"1d9dd5c39ce872d5fb5a138548b1524aa4b99ee45492e2756bfb4822f56237e5"} Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.280654 4990 scope.go:117] "RemoveContainer" containerID="0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.280757 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.359566 4990 scope.go:117] "RemoveContainer" containerID="0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360176 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360344 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360398 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tbr\" (UniqueName: \"kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360555 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: E1003 10:04:13.360630 4990 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108\": container with ID starting with 0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108 not found: ID does not exist" containerID="0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360690 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108"} err="failed to get container status \"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108\": rpc error: code = NotFound desc = could not find container \"0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108\": container with ID starting with 0462aafec2082f8a8472b50ffbbbab66ce3ffd1c4bcb138e547ce4a17c2c1108 not found: ID does not exist" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.360794 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msrkd\" (UniqueName: \"kubernetes.io/projected/78a0dde5-4d0a-49a7-b9b1-081a994a41da-kube-api-access-msrkd\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.426537 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.461150 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.462371 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.462482 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.462517 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79tbr\" (UniqueName: \"kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.462577 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.473795 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.478137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.482535 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.495164 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.497295 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.498489 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tbr\" (UniqueName: \"kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr\") pod \"nova-cell0-cell-mapping-xq2vm\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.505038 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.519713 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.533094 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.543106 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.546011 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.555150 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.555315 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.565903 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.567382 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.571721 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.583859 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.618045 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.619686 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.621114 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.628004 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.628859 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.641117 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.643163 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.649642 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667113 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667279 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667321 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6pz\" (UniqueName: \"kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz\") pod \"kube-state-metrics-0\" (UID: 
\"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667529 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667550 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.667587 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2n82\" (UniqueName: \"kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.726682 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.768853 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgq6\" (UniqueName: \"kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.768894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.768924 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.768962 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.768998 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769030 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6pz\" (UniqueName: 
\"kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769054 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769076 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svks9\" (UniqueName: \"kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769108 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jxv\" (UniqueName: \"kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769128 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769161 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769182 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769224 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769258 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769279 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769298 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769328 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2n82\" (UniqueName: \"kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.769359 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.773534 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.778140 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.778255 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 
10:04:13.779959 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.787078 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.817164 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2n82\" (UniqueName: \"kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.817718 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6pz\" (UniqueName: \"kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz\") pod \"kube-state-metrics-0\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") " pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.818703 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.820770 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.854165 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"] Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.886922 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888619 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgq6\" (UniqueName: \"kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888655 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888682 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888778 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svks9\" (UniqueName: \"kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888814 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-x6jxv\" (UniqueName: \"kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888837 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888925 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888962 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.888982 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " 
pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.889025 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.895786 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.910168 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.910716 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.916065 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.924276 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.933260 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.963246 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.963464 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data\") pod \"nova-metadata-0\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.963703 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.969279 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgq6\" (UniqueName: \"kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6\") pod \"nova-api-0\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " pod="openstack/nova-api-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.979463 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svks9\" (UniqueName: \"kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9\") pod \"nova-metadata-0\" (UID: 
\"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " pod="openstack/nova-metadata-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.979981 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jxv\" (UniqueName: \"kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv\") pod \"nova-scheduler-0\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:13 crc kubenswrapper[4990]: I1003 10:04:13.996009 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.003540 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48sx\" (UniqueName: \"kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.003745 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.003867 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.004009 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.004173 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.004227 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106018 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106079 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106126 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106198 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106270 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48sx\" (UniqueName: \"kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.106945 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.107357 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.107376 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.119298 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.131598 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.137346 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48sx\" (UniqueName: \"kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx\") pod \"dnsmasq-dns-5577d7975c-6nzt4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") " pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.207143 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.262383 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.401162 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.403040 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xq2vm"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.444348 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.702905 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.738832 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-67mbn"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.740341 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.744467 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.744692 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.760524 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-67mbn"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.825381 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.825562 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.825616 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.825792 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9tm4w\" (UniqueName: \"kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.861529 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.902889 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a0dde5-4d0a-49a7-b9b1-081a994a41da" path="/var/lib/kubelet/pods/78a0dde5-4d0a-49a7-b9b1-081a994a41da/volumes" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.928401 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.928493 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.928554 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.928654 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9tm4w\" (UniqueName: \"kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.941395 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.944490 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.963173 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.969945 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:14 crc kubenswrapper[4990]: I1003 10:04:14.991068 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tm4w\" (UniqueName: \"kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w\") pod \"nova-cell1-conductor-db-sync-67mbn\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 
10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.102998 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"] Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.116247 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.135797 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.314283 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b02d939-971c-4e77-8ca6-ff6a535e207b","Type":"ContainerStarted","Data":"b532fae658886f446e3e7fc5491580b33e47a82e925e85fb3a0c92fda9781b19"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.321504 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e","Type":"ContainerStarted","Data":"4f0e338210e994a86fff92deaf3ed83ad54e6c94bb58ef639fca6825c949df3a"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.324469 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" event={"ID":"c2ec452a-525c-458f-b876-dbef8ff507f4","Type":"ContainerStarted","Data":"a1c08ab9ae75d4710dfcfb5af700b63cd3ecfea691eaa6c3ba960531194bf1ad"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.339560 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56fcd909-29b5-472d-8007-84fc511ac818","Type":"ContainerStarted","Data":"c77b889243516ebf608418e7d11a97d534c9b1388e38a9b0f3b4e0845ab77456"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.350192 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xq2vm" 
event={"ID":"2ccfb8f0-6106-4c88-87eb-78e0e8481518","Type":"ContainerStarted","Data":"56593d82c59221b08ee85182fb62a7183bf2c14dede1dee8ab760ea79c4fb46a"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.350242 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xq2vm" event={"ID":"2ccfb8f0-6106-4c88-87eb-78e0e8481518","Type":"ContainerStarted","Data":"545df75da903a61da81df28a4ba137714fd1d053c8a49265a524d014758e0c33"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.366209 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerStarted","Data":"d7692d75d198198ff851d20eea5e7a3b8e2979f1f2ac5900f4fb69fd7a1b4403"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.371710 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerStarted","Data":"0a944432a670ca83466e402e3edcfffb6b9366c9f967f8c7d99399a0a9a60aef"} Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.407452 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xq2vm" podStartSLOduration=2.407433389 podStartE2EDuration="2.407433389s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:15.404634978 +0000 UTC m=+1237.201266855" watchObservedRunningTime="2025-10-03 10:04:15.407433389 +0000 UTC m=+1237.204065246" Oct 03 10:04:15 crc kubenswrapper[4990]: I1003 10:04:15.668167 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-67mbn"] Oct 03 10:04:15 crc kubenswrapper[4990]: W1003 10:04:15.671615 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd21cb3a0_498d_4edf_b411_8a13ae88e221.slice/crio-253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b WatchSource:0}: Error finding container 253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b: Status 404 returned error can't find the container with id 253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.198376 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.199083 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-central-agent" containerID="cri-o://c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0" gracePeriod=30 Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.199220 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="proxy-httpd" containerID="cri-o://11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e" gracePeriod=30 Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.199290 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="sg-core" containerID="cri-o://ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8" gracePeriod=30 Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.199337 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-notification-agent" containerID="cri-o://7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b" gracePeriod=30 Oct 03 10:04:16 crc 
kubenswrapper[4990]: I1003 10:04:16.389348 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56fcd909-29b5-472d-8007-84fc511ac818","Type":"ContainerStarted","Data":"31042aa385dfd4cdb15ed74ad929e7230170e4b65fe953249dedab85ce6fb90d"} Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.390328 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.395586 4990 generic.go:334] "Generic (PLEG): container finished" podID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerID="ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8" exitCode=2 Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.395710 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerDied","Data":"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8"} Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.402641 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-67mbn" event={"ID":"d21cb3a0-498d-4edf-b411-8a13ae88e221","Type":"ContainerStarted","Data":"72d2eb3b6894a8014c179dca4994554aac84f2a81598f222f9283646a4f6d6e6"} Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.402690 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-67mbn" event={"ID":"d21cb3a0-498d-4edf-b411-8a13ae88e221","Type":"ContainerStarted","Data":"253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b"} Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.414236 4990 generic.go:334] "Generic (PLEG): container finished" podID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerID="9cdcd500d5975a53eb34c68565bee9aecf2028ebc08b6a9bd8bf5b593c18a194" exitCode=0 Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.415450 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" event={"ID":"c2ec452a-525c-458f-b876-dbef8ff507f4","Type":"ContainerDied","Data":"9cdcd500d5975a53eb34c68565bee9aecf2028ebc08b6a9bd8bf5b593c18a194"} Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.431501 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.643088816 podStartE2EDuration="3.431482551s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="2025-10-03 10:04:14.490247138 +0000 UTC m=+1236.286878985" lastFinishedPulling="2025-10-03 10:04:15.278640863 +0000 UTC m=+1237.075272720" observedRunningTime="2025-10-03 10:04:16.407690357 +0000 UTC m=+1238.204322234" watchObservedRunningTime="2025-10-03 10:04:16.431482551 +0000 UTC m=+1238.228114408" Oct 03 10:04:16 crc kubenswrapper[4990]: I1003 10:04:16.460608 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-67mbn" podStartSLOduration=2.460583404 podStartE2EDuration="2.460583404s" podCreationTimestamp="2025-10-03 10:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:16.422445657 +0000 UTC m=+1238.219077504" watchObservedRunningTime="2025-10-03 10:04:16.460583404 +0000 UTC m=+1238.257215261" Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.309603 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.328221 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.437578 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" 
event={"ID":"c2ec452a-525c-458f-b876-dbef8ff507f4","Type":"ContainerStarted","Data":"24992578c77dff6b57091b7e6c8588c10de059bad8e64877ece6521dbc3ef205"} Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.438067 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.452719 4990 generic.go:334] "Generic (PLEG): container finished" podID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerID="11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e" exitCode=0 Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.452770 4990 generic.go:334] "Generic (PLEG): container finished" podID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerID="c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0" exitCode=0 Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.452979 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerDied","Data":"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e"} Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.453031 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerDied","Data":"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0"} Oct 03 10:04:17 crc kubenswrapper[4990]: I1003 10:04:17.463115 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" podStartSLOduration=4.463096526 podStartE2EDuration="4.463096526s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:17.460964994 +0000 UTC m=+1239.257596881" watchObservedRunningTime="2025-10-03 10:04:17.463096526 +0000 UTC 
m=+1239.259728383" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.508733 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.527447 4990 generic.go:334] "Generic (PLEG): container finished" podID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerID="7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b" exitCode=0 Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.527648 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerDied","Data":"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b"} Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.527767 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b504d2cf-6136-4df4-adca-e233c14dc47a","Type":"ContainerDied","Data":"4efde896e24e291c52f408ff4563481723c2303a3b819a5c9c9538eb499b61f5"} Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.527835 4990 scope.go:117] "RemoveContainer" containerID="11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.589956 4990 scope.go:117] "RemoveContainer" containerID="ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.619651 4990 scope.go:117] "RemoveContainer" containerID="7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.665224 4990 scope.go:117] "RemoveContainer" containerID="c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691105 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691189 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691244 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691303 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691321 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.691372 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zj4h\" (UniqueName: \"kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 
10:04:21.691443 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts\") pod \"b504d2cf-6136-4df4-adca-e233c14dc47a\" (UID: \"b504d2cf-6136-4df4-adca-e233c14dc47a\") " Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.692071 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.692448 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.700321 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h" (OuterVolumeSpecName: "kube-api-access-6zj4h") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "kube-api-access-6zj4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.706561 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts" (OuterVolumeSpecName: "scripts") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.715265 4990 scope.go:117] "RemoveContainer" containerID="11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e" Oct 03 10:04:21 crc kubenswrapper[4990]: E1003 10:04:21.720049 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e\": container with ID starting with 11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e not found: ID does not exist" containerID="11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.720092 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e"} err="failed to get container status \"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e\": rpc error: code = NotFound desc = could not find container \"11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e\": container with ID starting with 11f0e3ccecbdaa6cf2384debc2885b76da511faf5a7d00e9873a4a658b3de88e not found: ID does not exist" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.720120 4990 scope.go:117] "RemoveContainer" containerID="ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8" Oct 03 10:04:21 crc kubenswrapper[4990]: E1003 10:04:21.723067 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8\": container with ID starting with ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8 not found: ID does not exist" containerID="ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.723094 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8"} err="failed to get container status \"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8\": rpc error: code = NotFound desc = could not find container \"ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8\": container with ID starting with ae75f7622982c1ff71cf36efad44184181794f099d666ba6c4dccde956181be8 not found: ID does not exist" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.723110 4990 scope.go:117] "RemoveContainer" containerID="7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b" Oct 03 10:04:21 crc kubenswrapper[4990]: E1003 10:04:21.729426 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b\": container with ID starting with 7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b not found: ID does not exist" containerID="7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.729455 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b"} err="failed to get container status \"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b\": rpc error: code = NotFound desc = could not find container \"7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b\": container with ID starting with 7c43cc8c1c9ffc8838954e9a38f5615fd1038a17b8979ca87d4e8f9c99e5ce5b not found: ID does not exist" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.729471 4990 scope.go:117] "RemoveContainer" containerID="c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0" Oct 03 10:04:21 crc kubenswrapper[4990]: E1003 
10:04:21.729853 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0\": container with ID starting with c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0 not found: ID does not exist" containerID="c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.729876 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0"} err="failed to get container status \"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0\": rpc error: code = NotFound desc = could not find container \"c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0\": container with ID starting with c456f18110af38eefda7ab82817389316c715345db9f8a3af3cade688e793be0 not found: ID does not exist" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.747625 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.793825 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.793860 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.793872 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.793886 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b504d2cf-6136-4df4-adca-e233c14dc47a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.793896 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zj4h\" (UniqueName: \"kubernetes.io/projected/b504d2cf-6136-4df4-adca-e233c14dc47a-kube-api-access-6zj4h\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.813552 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data" (OuterVolumeSpecName: "config-data") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.822657 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b504d2cf-6136-4df4-adca-e233c14dc47a" (UID: "b504d2cf-6136-4df4-adca-e233c14dc47a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.895789 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:21 crc kubenswrapper[4990]: I1003 10:04:21.896140 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b504d2cf-6136-4df4-adca-e233c14dc47a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.537426 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b02d939-971c-4e77-8ca6-ff6a535e207b","Type":"ContainerStarted","Data":"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.539628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e","Type":"ContainerStarted","Data":"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.539719 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182" gracePeriod=30 Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.540911 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.543896 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerStarted","Data":"2820731a6f9d16c8bcb30e99bdc2a985f9a42273b228511e6077d428aa4968fc"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.543944 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerStarted","Data":"a3ff65db3e039fe244fb39ebe85806f1e5811322d248180fad53bc31a7d05cb4"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.547015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerStarted","Data":"79dec94b6e58d8380b6d01bcd8fdb5df43c7ec5d01ecd5e932a105271f2befff"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.547044 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerStarted","Data":"347d67bb12ce3e5e6011f5c6fe19efe768e9510a39847f268875bc641c8498fe"} Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.547169 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-log" containerID="cri-o://347d67bb12ce3e5e6011f5c6fe19efe768e9510a39847f268875bc641c8498fe" gracePeriod=30 Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.547288 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-metadata" containerID="cri-o://79dec94b6e58d8380b6d01bcd8fdb5df43c7ec5d01ecd5e932a105271f2befff" gracePeriod=30 Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.570254 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.444906439 podStartE2EDuration="9.570231987s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="2025-10-03 10:04:14.98690108 +0000 UTC m=+1236.783532927" lastFinishedPulling="2025-10-03 10:04:21.112226618 +0000 UTC m=+1242.908858475" observedRunningTime="2025-10-03 10:04:22.566334879 +0000 UTC m=+1244.362966736" watchObservedRunningTime="2025-10-03 10:04:22.570231987 +0000 UTC m=+1244.366863844" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.611293 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.396473089 podStartE2EDuration="9.611279458s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="2025-10-03 10:04:14.860339247 +0000 UTC m=+1236.656971104" lastFinishedPulling="2025-10-03 10:04:21.075145616 +0000 UTC m=+1242.871777473" observedRunningTime="2025-10-03 10:04:22.610629809 +0000 UTC m=+1244.407261676" watchObservedRunningTime="2025-10-03 10:04:22.611279458 +0000 UTC m=+1244.407911315" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.639112 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.653242 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.663199 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.29168834 podStartE2EDuration="9.663174701s" podCreationTimestamp="2025-10-03 10:04:13 +0000 
UTC" firstStartedPulling="2025-10-03 10:04:14.729775823 +0000 UTC m=+1236.526407680" lastFinishedPulling="2025-10-03 10:04:21.101262184 +0000 UTC m=+1242.897894041" observedRunningTime="2025-10-03 10:04:22.649200153 +0000 UTC m=+1244.445832000" watchObservedRunningTime="2025-10-03 10:04:22.663174701 +0000 UTC m=+1244.459806558" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.690840 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:22 crc kubenswrapper[4990]: E1003 10:04:22.691361 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-central-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691375 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-central-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: E1003 10:04:22.691390 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="sg-core" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691396 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="sg-core" Oct 03 10:04:22 crc kubenswrapper[4990]: E1003 10:04:22.691411 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="proxy-httpd" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691417 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="proxy-httpd" Oct 03 10:04:22 crc kubenswrapper[4990]: E1003 10:04:22.691444 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-notification-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691456 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-notification-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691704 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="sg-core" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691726 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="proxy-httpd" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691740 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-notification-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.691753 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" containerName="ceilometer-central-agent" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.693601 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.699225 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.699463 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.699609 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.707249 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.727201382 podStartE2EDuration="9.707225506s" podCreationTimestamp="2025-10-03 10:04:13 +0000 UTC" firstStartedPulling="2025-10-03 10:04:15.10442022 +0000 UTC m=+1236.901052077" lastFinishedPulling="2025-10-03 10:04:21.084444344 +0000 UTC m=+1242.881076201" observedRunningTime="2025-10-03 10:04:22.691233958 +0000 UTC m=+1244.487865815" watchObservedRunningTime="2025-10-03 10:04:22.707225506 +0000 UTC m=+1244.503857363" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.707572 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818096 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818156 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818199 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818241 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjbv\" (UniqueName: \"kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818260 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818275 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.818355 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 
10:04:22.818525 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.883307 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b504d2cf-6136-4df4-adca-e233c14dc47a" path="/var/lib/kubelet/pods/b504d2cf-6136-4df4-adca-e233c14dc47a/volumes" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920366 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920493 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920634 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjbv\" (UniqueName: \"kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 
03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920662 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920688 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920741 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.920798 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.921287 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.921550 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd\") pod 
\"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.926218 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.926471 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.928317 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.928560 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.950188 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:22 crc kubenswrapper[4990]: I1003 10:04:22.953114 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjbv\" 
(UniqueName: \"kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv\") pod \"ceilometer-0\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " pod="openstack/ceilometer-0" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.043264 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.579581 4990 generic.go:334] "Generic (PLEG): container finished" podID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerID="79dec94b6e58d8380b6d01bcd8fdb5df43c7ec5d01ecd5e932a105271f2befff" exitCode=0 Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.579947 4990 generic.go:334] "Generic (PLEG): container finished" podID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerID="347d67bb12ce3e5e6011f5c6fe19efe768e9510a39847f268875bc641c8498fe" exitCode=143 Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.581433 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerDied","Data":"79dec94b6e58d8380b6d01bcd8fdb5df43c7ec5d01ecd5e932a105271f2befff"} Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.581474 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerDied","Data":"347d67bb12ce3e5e6011f5c6fe19efe768e9510a39847f268875bc641c8498fe"} Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.645782 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:23 crc kubenswrapper[4990]: W1003 10:04:23.658306 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8a0fbd_0f1a_4917_8467_190e72695912.slice/crio-861e0a5bd75bb41ca37aca1bc0116c922699b7803320543bba0bb7620d92a5a6 WatchSource:0}: Error finding container 
861e0a5bd75bb41ca37aca1bc0116c922699b7803320543bba0bb7620d92a5a6: Status 404 returned error can't find the container with id 861e0a5bd75bb41ca37aca1bc0116c922699b7803320543bba0bb7620d92a5a6 Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.750160 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.906165 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.910654 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.950978 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs\") pod \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.951325 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs" (OuterVolumeSpecName: "logs") pod "e812c1d8-392d-4ae9-bcc0-e836ae5df257" (UID: "e812c1d8-392d-4ae9-bcc0-e836ae5df257"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.951388 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data\") pod \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.951466 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle\") pod \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.951671 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svks9\" (UniqueName: \"kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9\") pod \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\" (UID: \"e812c1d8-392d-4ae9-bcc0-e836ae5df257\") " Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.952628 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e812c1d8-392d-4ae9-bcc0-e836ae5df257-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.967004 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9" (OuterVolumeSpecName: "kube-api-access-svks9") pod "e812c1d8-392d-4ae9-bcc0-e836ae5df257" (UID: "e812c1d8-392d-4ae9-bcc0-e836ae5df257"). InnerVolumeSpecName "kube-api-access-svks9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:23 crc kubenswrapper[4990]: I1003 10:04:23.994063 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e812c1d8-392d-4ae9-bcc0-e836ae5df257" (UID: "e812c1d8-392d-4ae9-bcc0-e836ae5df257"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.018126 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data" (OuterVolumeSpecName: "config-data") pod "e812c1d8-392d-4ae9-bcc0-e836ae5df257" (UID: "e812c1d8-392d-4ae9-bcc0-e836ae5df257"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.054952 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svks9\" (UniqueName: \"kubernetes.io/projected/e812c1d8-392d-4ae9-bcc0-e836ae5df257-kube-api-access-svks9\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.055144 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.055230 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e812c1d8-392d-4ae9-bcc0-e836ae5df257-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.208295 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.208330 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.246448 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.263920 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.264156 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.402940 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.489984 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.490623 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="dnsmasq-dns" containerID="cri-o://69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279" gracePeriod=10 Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.593961 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e812c1d8-392d-4ae9-bcc0-e836ae5df257","Type":"ContainerDied","Data":"0a944432a670ca83466e402e3edcfffb6b9366c9f967f8c7d99399a0a9a60aef"} Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.594042 4990 scope.go:117] "RemoveContainer" containerID="79dec94b6e58d8380b6d01bcd8fdb5df43c7ec5d01ecd5e932a105271f2befff" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.593990 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.595936 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerStarted","Data":"bb89622637cbc905e62915a052bd2f9da0291eefe6fa34e3aa8c51ef8d6aeba5"} Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.595961 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerStarted","Data":"861e0a5bd75bb41ca37aca1bc0116c922699b7803320543bba0bb7620d92a5a6"} Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.597203 4990 generic.go:334] "Generic (PLEG): container finished" podID="2ccfb8f0-6106-4c88-87eb-78e0e8481518" containerID="56593d82c59221b08ee85182fb62a7183bf2c14dede1dee8ab760ea79c4fb46a" exitCode=0 Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.598175 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xq2vm" event={"ID":"2ccfb8f0-6106-4c88-87eb-78e0e8481518","Type":"ContainerDied","Data":"56593d82c59221b08ee85182fb62a7183bf2c14dede1dee8ab760ea79c4fb46a"} Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.638061 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.651894 4990 scope.go:117] "RemoveContainer" containerID="347d67bb12ce3e5e6011f5c6fe19efe768e9510a39847f268875bc641c8498fe" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.676022 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.698632 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.711655 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Oct 03 10:04:24 crc kubenswrapper[4990]: E1003 10:04:24.712210 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-metadata" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.712234 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-metadata" Oct 03 10:04:24 crc kubenswrapper[4990]: E1003 10:04:24.712267 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-log" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.712277 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-log" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.712550 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-log" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.712603 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" containerName="nova-metadata-metadata" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.714005 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.726267 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.726459 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.753850 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.881720 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.881851 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.881906 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.881935 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs\") pod \"nova-metadata-0\" (UID: 
\"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.881982 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb4pw\" (UniqueName: \"kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.886353 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e812c1d8-392d-4ae9-bcc0-e836ae5df257" path="/var/lib/kubelet/pods/e812c1d8-392d-4ae9-bcc0-e836ae5df257/volumes" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.983046 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.983378 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.983433 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb4pw\" (UniqueName: \"kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.983504 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.983598 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.984883 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.993318 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:24 crc kubenswrapper[4990]: I1003 10:04:24.993757 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.007229 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb4pw\" (UniqueName: \"kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:25 
crc kubenswrapper[4990]: I1003 10:04:25.008050 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " pod="openstack/nova-metadata-0" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.053982 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.248382 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.349040 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.349573 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.390937 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.391180 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmb27\" (UniqueName: 
\"kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.391281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.391417 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.391897 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.392051 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config\") pod \"be56917c-d7de-4294-8dfc-97ebddbd3240\" (UID: \"be56917c-d7de-4294-8dfc-97ebddbd3240\") " Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.415643 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27" (OuterVolumeSpecName: "kube-api-access-pmb27") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "kube-api-access-pmb27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.484027 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.498888 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.498928 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmb27\" (UniqueName: \"kubernetes.io/projected/be56917c-d7de-4294-8dfc-97ebddbd3240-kube-api-access-pmb27\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.579560 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.586492 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.602386 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.602414 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.608924 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.611086 4990 generic.go:334] "Generic (PLEG): container finished" podID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerID="69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279" exitCode=0 Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.613233 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.613463 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" event={"ID":"be56917c-d7de-4294-8dfc-97ebddbd3240","Type":"ContainerDied","Data":"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279"} Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.613585 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55c55b8f-6gtww" event={"ID":"be56917c-d7de-4294-8dfc-97ebddbd3240","Type":"ContainerDied","Data":"d1fc92db19f1778a792b14cc742653f7729c6967b1ca89fade83fc549ac9cac3"} Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.614522 4990 scope.go:117] "RemoveContainer" containerID="69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.627630 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config" (OuterVolumeSpecName: "config") pod "be56917c-d7de-4294-8dfc-97ebddbd3240" (UID: "be56917c-d7de-4294-8dfc-97ebddbd3240"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.657356 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.667637 4990 scope.go:117] "RemoveContainer" containerID="50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.699287 4990 scope.go:117] "RemoveContainer" containerID="69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279" Oct 03 10:04:25 crc kubenswrapper[4990]: E1003 10:04:25.699849 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279\": container with ID starting with 69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279 not found: ID does not exist" containerID="69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.699880 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279"} err="failed to get container status \"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279\": rpc error: code = NotFound desc = could not find container \"69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279\": container with ID starting with 69313dd5938b633613bca5e07be485d53d61c59b3e71dcd6f31d273376db6279 not found: ID does not exist" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.699903 4990 scope.go:117] "RemoveContainer" containerID="50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6" Oct 03 10:04:25 crc kubenswrapper[4990]: E1003 10:04:25.700265 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6\": container with ID starting with 50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6 not found: ID does not exist" containerID="50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.700308 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6"} err="failed to get container status \"50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6\": rpc error: code = NotFound desc = could not find container \"50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6\": container with ID starting with 50a451b8a35f584b33a66c4e2b43878e1c8419e7ebd913430bce74282d793cb6 not found: ID does not exist" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.704394 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.704419 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be56917c-d7de-4294-8dfc-97ebddbd3240-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.946667 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.965573 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:04:25 crc kubenswrapper[4990]: I1003 10:04:25.982631 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f55c55b8f-6gtww"] Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.111030 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts\") pod \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.111191 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data\") pod \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.111219 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle\") pod \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.111321 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79tbr\" (UniqueName: \"kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr\") pod \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\" (UID: \"2ccfb8f0-6106-4c88-87eb-78e0e8481518\") " Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.157437 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts" (OuterVolumeSpecName: "scripts") pod "2ccfb8f0-6106-4c88-87eb-78e0e8481518" (UID: "2ccfb8f0-6106-4c88-87eb-78e0e8481518"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.158613 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr" (OuterVolumeSpecName: "kube-api-access-79tbr") pod "2ccfb8f0-6106-4c88-87eb-78e0e8481518" (UID: "2ccfb8f0-6106-4c88-87eb-78e0e8481518"). InnerVolumeSpecName "kube-api-access-79tbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.161724 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data" (OuterVolumeSpecName: "config-data") pod "2ccfb8f0-6106-4c88-87eb-78e0e8481518" (UID: "2ccfb8f0-6106-4c88-87eb-78e0e8481518"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.164671 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ccfb8f0-6106-4c88-87eb-78e0e8481518" (UID: "2ccfb8f0-6106-4c88-87eb-78e0e8481518"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.214968 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.215005 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.215016 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79tbr\" (UniqueName: \"kubernetes.io/projected/2ccfb8f0-6106-4c88-87eb-78e0e8481518-kube-api-access-79tbr\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.215025 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccfb8f0-6106-4c88-87eb-78e0e8481518-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.623535 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xq2vm" event={"ID":"2ccfb8f0-6106-4c88-87eb-78e0e8481518","Type":"ContainerDied","Data":"545df75da903a61da81df28a4ba137714fd1d053c8a49265a524d014758e0c33"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.623904 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545df75da903a61da81df28a4ba137714fd1d053c8a49265a524d014758e0c33" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.623869 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xq2vm" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.627305 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerStarted","Data":"996342a50d0cd52c6936d7b1cfed1c5fa7d5c3b6d7726ef5217dffc430e8c991"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.627337 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerStarted","Data":"92a1b71aeb7e67cae559de493a423aa2624c77c669ed0f26d9367eb3c7be8311"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.629698 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerStarted","Data":"571e0560a2c3bd0867c398b30c67b98bf29483c5986a92260415cfd8495b4472"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.629750 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerStarted","Data":"4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.629764 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerStarted","Data":"f4083155b4e2cd24c086749af896915aefbe288cbc485e549d7d9305433b8cae"} Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.656292 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6562652719999997 podStartE2EDuration="2.656265272s" podCreationTimestamp="2025-10-03 10:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 10:04:26.648149932 +0000 UTC m=+1248.444781799" watchObservedRunningTime="2025-10-03 10:04:26.656265272 +0000 UTC m=+1248.452897129" Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.832614 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.832933 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-log" containerID="cri-o://a3ff65db3e039fe244fb39ebe85806f1e5811322d248180fad53bc31a7d05cb4" gracePeriod=30 Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.833548 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-api" containerID="cri-o://2820731a6f9d16c8bcb30e99bdc2a985f9a42273b228511e6077d428aa4968fc" gracePeriod=30 Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.844798 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.861895 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:26 crc kubenswrapper[4990]: I1003 10:04:26.883981 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" path="/var/lib/kubelet/pods/be56917c-d7de-4294-8dfc-97ebddbd3240/volumes" Oct 03 10:04:27 crc kubenswrapper[4990]: I1003 10:04:27.642727 4990 generic.go:334] "Generic (PLEG): container finished" podID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerID="a3ff65db3e039fe244fb39ebe85806f1e5811322d248180fad53bc31a7d05cb4" exitCode=143 Oct 03 10:04:27 crc kubenswrapper[4990]: I1003 10:04:27.642818 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerDied","Data":"a3ff65db3e039fe244fb39ebe85806f1e5811322d248180fad53bc31a7d05cb4"} Oct 03 10:04:27 crc kubenswrapper[4990]: I1003 10:04:27.643384 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerName="nova-scheduler-scheduler" containerID="cri-o://17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" gracePeriod=30 Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.655567 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerStarted","Data":"9154c966301a043ec5d171a22e0f9746800a428a672c68e3044d67234cc77137"} Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.656016 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.681717 4990 generic.go:334] "Generic (PLEG): container finished" podID="d21cb3a0-498d-4edf-b411-8a13ae88e221" containerID="72d2eb3b6894a8014c179dca4994554aac84f2a81598f222f9283646a4f6d6e6" exitCode=0 Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.681852 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-67mbn" event={"ID":"d21cb3a0-498d-4edf-b411-8a13ae88e221","Type":"ContainerDied","Data":"72d2eb3b6894a8014c179dca4994554aac84f2a81598f222f9283646a4f6d6e6"} Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.681942 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-log" containerID="cri-o://4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1" gracePeriod=30 Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.682276 4990 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-metadata" containerID="cri-o://571e0560a2c3bd0867c398b30c67b98bf29483c5986a92260415cfd8495b4472" gracePeriod=30 Oct 03 10:04:28 crc kubenswrapper[4990]: I1003 10:04:28.721769 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.911634656 podStartE2EDuration="6.721753874s" podCreationTimestamp="2025-10-03 10:04:22 +0000 UTC" firstStartedPulling="2025-10-03 10:04:23.660664397 +0000 UTC m=+1245.457296264" lastFinishedPulling="2025-10-03 10:04:27.470783615 +0000 UTC m=+1249.267415482" observedRunningTime="2025-10-03 10:04:28.717356748 +0000 UTC m=+1250.513988635" watchObservedRunningTime="2025-10-03 10:04:28.721753874 +0000 UTC m=+1250.518385731" Oct 03 10:04:28 crc kubenswrapper[4990]: E1003 10:04:28.938175 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3ee493_6c0a_4519_ab88_6e3093595cba.slice/crio-4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1.scope\": RecentStats: unable to find data in memory cache]" Oct 03 10:04:29 crc kubenswrapper[4990]: E1003 10:04:29.210580 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:04:29 crc kubenswrapper[4990]: E1003 10:04:29.212742 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:04:29 crc kubenswrapper[4990]: E1003 10:04:29.214247 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:04:29 crc kubenswrapper[4990]: E1003 10:04:29.214366 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerName="nova-scheduler-scheduler" Oct 03 10:04:29 crc kubenswrapper[4990]: I1003 10:04:29.695419 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerID="571e0560a2c3bd0867c398b30c67b98bf29483c5986a92260415cfd8495b4472" exitCode=0 Oct 03 10:04:29 crc kubenswrapper[4990]: I1003 10:04:29.695451 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerID="4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1" exitCode=143 Oct 03 10:04:29 crc kubenswrapper[4990]: I1003 10:04:29.695533 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerDied","Data":"571e0560a2c3bd0867c398b30c67b98bf29483c5986a92260415cfd8495b4472"} Oct 03 10:04:29 crc kubenswrapper[4990]: I1003 10:04:29.695595 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerDied","Data":"4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1"} Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.056245 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.056570 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.229373 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.233582 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.319994 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts\") pod \"d21cb3a0-498d-4edf-b411-8a13ae88e221\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320398 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data\") pod \"d21cb3a0-498d-4edf-b411-8a13ae88e221\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320430 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs\") pod \"3e3ee493-6c0a-4519-ab88-6e3093595cba\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320465 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9tm4w\" (UniqueName: \"kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w\") pod \"d21cb3a0-498d-4edf-b411-8a13ae88e221\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320517 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle\") pod \"d21cb3a0-498d-4edf-b411-8a13ae88e221\" (UID: \"d21cb3a0-498d-4edf-b411-8a13ae88e221\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320549 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs\") pod \"3e3ee493-6c0a-4519-ab88-6e3093595cba\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320591 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle\") pod \"3e3ee493-6c0a-4519-ab88-6e3093595cba\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320633 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data\") pod \"3e3ee493-6c0a-4519-ab88-6e3093595cba\" (UID: \"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.320661 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb4pw\" (UniqueName: \"kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw\") pod \"3e3ee493-6c0a-4519-ab88-6e3093595cba\" (UID: 
\"3e3ee493-6c0a-4519-ab88-6e3093595cba\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.323261 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs" (OuterVolumeSpecName: "logs") pod "3e3ee493-6c0a-4519-ab88-6e3093595cba" (UID: "3e3ee493-6c0a-4519-ab88-6e3093595cba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.328021 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts" (OuterVolumeSpecName: "scripts") pod "d21cb3a0-498d-4edf-b411-8a13ae88e221" (UID: "d21cb3a0-498d-4edf-b411-8a13ae88e221"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.335763 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw" (OuterVolumeSpecName: "kube-api-access-xb4pw") pod "3e3ee493-6c0a-4519-ab88-6e3093595cba" (UID: "3e3ee493-6c0a-4519-ab88-6e3093595cba"). InnerVolumeSpecName "kube-api-access-xb4pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.346046 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w" (OuterVolumeSpecName: "kube-api-access-9tm4w") pod "d21cb3a0-498d-4edf-b411-8a13ae88e221" (UID: "d21cb3a0-498d-4edf-b411-8a13ae88e221"). InnerVolumeSpecName "kube-api-access-9tm4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.357210 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d21cb3a0-498d-4edf-b411-8a13ae88e221" (UID: "d21cb3a0-498d-4edf-b411-8a13ae88e221"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.359634 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data" (OuterVolumeSpecName: "config-data") pod "d21cb3a0-498d-4edf-b411-8a13ae88e221" (UID: "d21cb3a0-498d-4edf-b411-8a13ae88e221"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.362782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e3ee493-6c0a-4519-ab88-6e3093595cba" (UID: "3e3ee493-6c0a-4519-ab88-6e3093595cba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.363864 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data" (OuterVolumeSpecName: "config-data") pod "3e3ee493-6c0a-4519-ab88-6e3093595cba" (UID: "3e3ee493-6c0a-4519-ab88-6e3093595cba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.379842 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3e3ee493-6c0a-4519-ab88-6e3093595cba" (UID: "3e3ee493-6c0a-4519-ab88-6e3093595cba"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.387805 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425875 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425914 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425924 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb4pw\" (UniqueName: \"kubernetes.io/projected/3e3ee493-6c0a-4519-ab88-6e3093595cba-kube-api-access-xb4pw\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425935 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425944 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-config-data\") on node 
\"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425953 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3ee493-6c0a-4519-ab88-6e3093595cba-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425962 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tm4w\" (UniqueName: \"kubernetes.io/projected/d21cb3a0-498d-4edf-b411-8a13ae88e221-kube-api-access-9tm4w\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425970 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d21cb3a0-498d-4edf-b411-8a13ae88e221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.425978 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3ee493-6c0a-4519-ab88-6e3093595cba-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.527496 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle\") pod \"8b02d939-971c-4e77-8ca6-ff6a535e207b\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.527621 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jxv\" (UniqueName: \"kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv\") pod \"8b02d939-971c-4e77-8ca6-ff6a535e207b\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.527664 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data\") pod \"8b02d939-971c-4e77-8ca6-ff6a535e207b\" (UID: \"8b02d939-971c-4e77-8ca6-ff6a535e207b\") " Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.530821 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv" (OuterVolumeSpecName: "kube-api-access-x6jxv") pod "8b02d939-971c-4e77-8ca6-ff6a535e207b" (UID: "8b02d939-971c-4e77-8ca6-ff6a535e207b"). InnerVolumeSpecName "kube-api-access-x6jxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.554925 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b02d939-971c-4e77-8ca6-ff6a535e207b" (UID: "8b02d939-971c-4e77-8ca6-ff6a535e207b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.555828 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data" (OuterVolumeSpecName: "config-data") pod "8b02d939-971c-4e77-8ca6-ff6a535e207b" (UID: "8b02d939-971c-4e77-8ca6-ff6a535e207b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.630270 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.630323 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6jxv\" (UniqueName: \"kubernetes.io/projected/8b02d939-971c-4e77-8ca6-ff6a535e207b-kube-api-access-x6jxv\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.630337 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02d939-971c-4e77-8ca6-ff6a535e207b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.705862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3e3ee493-6c0a-4519-ab88-6e3093595cba","Type":"ContainerDied","Data":"f4083155b4e2cd24c086749af896915aefbe288cbc485e549d7d9305433b8cae"} Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.705912 4990 scope.go:117] "RemoveContainer" containerID="571e0560a2c3bd0867c398b30c67b98bf29483c5986a92260415cfd8495b4472" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.706026 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.720405 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-67mbn" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.720829 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-67mbn" event={"ID":"d21cb3a0-498d-4edf-b411-8a13ae88e221","Type":"ContainerDied","Data":"253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b"} Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.720873 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253f54911801ef6c632e01a3e6840ff3afcc08898f7cc7d8b9f8af09354a224b" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.724434 4990 generic.go:334] "Generic (PLEG): container finished" podID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" exitCode=0 Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.724487 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b02d939-971c-4e77-8ca6-ff6a535e207b","Type":"ContainerDied","Data":"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80"} Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.724525 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b02d939-971c-4e77-8ca6-ff6a535e207b","Type":"ContainerDied","Data":"b532fae658886f446e3e7fc5491580b33e47a82e925e85fb3a0c92fda9781b19"} Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.724583 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.740833 4990 scope.go:117] "RemoveContainer" containerID="4a104156915dcb256a1513d24e547321ba7266879600f8df5f96aa8684633cb1" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.775023 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.775755 4990 scope.go:117] "RemoveContainer" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.807786 4990 scope.go:117] "RemoveContainer" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.810854 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80\": container with ID starting with 17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80 not found: ID does not exist" containerID="17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.810890 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80"} err="failed to get container status \"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80\": rpc error: code = NotFound desc = could not find container \"17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80\": container with ID starting with 17ba12ef62fe633e228571d1fa18991119a81d17a0d2533a0ff01c715b32ee80 not found: ID does not exist" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.814239 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: 
I1003 10:04:30.828498 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.849597 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850138 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="init" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850161 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="init" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850178 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccfb8f0-6106-4c88-87eb-78e0e8481518" containerName="nova-manage" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850187 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccfb8f0-6106-4c88-87eb-78e0e8481518" containerName="nova-manage" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850205 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-log" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850212 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-log" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850223 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-metadata" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850230 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-metadata" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850252 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerName="nova-scheduler-scheduler" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850259 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerName="nova-scheduler-scheduler" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850277 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="dnsmasq-dns" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850287 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="dnsmasq-dns" Oct 03 10:04:30 crc kubenswrapper[4990]: E1003 10:04:30.850323 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d21cb3a0-498d-4edf-b411-8a13ae88e221" containerName="nova-cell1-conductor-db-sync" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850332 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d21cb3a0-498d-4edf-b411-8a13ae88e221" containerName="nova-cell1-conductor-db-sync" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850584 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="be56917c-d7de-4294-8dfc-97ebddbd3240" containerName="dnsmasq-dns" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850606 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-log" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850615 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" containerName="nova-metadata-metadata" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850628 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccfb8f0-6106-4c88-87eb-78e0e8481518" containerName="nova-manage" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850635 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d21cb3a0-498d-4edf-b411-8a13ae88e221" containerName="nova-cell1-conductor-db-sync" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.850652 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" containerName="nova-scheduler-scheduler" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.851953 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.852996 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.858461 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.858794 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.860143 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.867026 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.868313 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.870219 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.890879 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3ee493-6c0a-4519-ab88-6e3093595cba" path="/var/lib/kubelet/pods/3e3ee493-6c0a-4519-ab88-6e3093595cba/volumes" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.891679 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b02d939-971c-4e77-8ca6-ff6a535e207b" path="/var/lib/kubelet/pods/8b02d939-971c-4e77-8ca6-ff6a535e207b/volumes" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.892414 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.912696 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.914156 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.916836 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 10:04:30 crc kubenswrapper[4990]: I1003 10:04:30.944151 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041018 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041064 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041223 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwpj\" (UniqueName: \"kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041289 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc 
kubenswrapper[4990]: I1003 10:04:31.041335 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdtr\" (UniqueName: \"kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041409 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041459 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041506 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.041548 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.042017 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzqr9\" (UniqueName: \"kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.042061 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.143254 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwpj\" (UniqueName: \"kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.143764 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.143976 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdtr\" (UniqueName: \"kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.144418 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.144579 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.144695 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.144814 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.144952 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzqr9\" (UniqueName: \"kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.145063 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs\") pod \"nova-metadata-0\" (UID: 
\"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.145466 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.145590 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.145537 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.148014 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.148344 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.148885 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.152222 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.154039 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.154170 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.157016 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.163131 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdtr\" (UniqueName: \"kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr\") pod \"nova-cell1-conductor-0\" (UID: 
\"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.166111 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzqr9\" (UniqueName: \"kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9\") pod \"nova-scheduler-0\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.166212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwpj\" (UniqueName: \"kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj\") pod \"nova-metadata-0\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.180065 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.192560 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.285073 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.729040 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.737312 4990 generic.go:334] "Generic (PLEG): container finished" podID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerID="2820731a6f9d16c8bcb30e99bdc2a985f9a42273b228511e6077d428aa4968fc" exitCode=0 Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.737346 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerDied","Data":"2820731a6f9d16c8bcb30e99bdc2a985f9a42273b228511e6077d428aa4968fc"} Oct 03 10:04:31 crc kubenswrapper[4990]: W1003 10:04:31.740957 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29b310f_27f8_49df_a6a8_bd2a589cd807.slice/crio-12065c8373202c09bbb1e03d9c083c4a81b36634f7e709e114ad425b73bd0866 WatchSource:0}: Error finding container 12065c8373202c09bbb1e03d9c083c4a81b36634f7e709e114ad425b73bd0866: Status 404 returned error can't find the container with id 12065c8373202c09bbb1e03d9c083c4a81b36634f7e709e114ad425b73bd0866 Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.869279 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 10:04:31 crc kubenswrapper[4990]: I1003 10:04:31.917274 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.232021 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.417921 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvgq6\" (UniqueName: \"kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6\") pod \"8db4914f-89b7-410f-b10a-40e6c1b7a921\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.418022 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data\") pod \"8db4914f-89b7-410f-b10a-40e6c1b7a921\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.418115 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle\") pod \"8db4914f-89b7-410f-b10a-40e6c1b7a921\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.418248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs\") pod \"8db4914f-89b7-410f-b10a-40e6c1b7a921\" (UID: \"8db4914f-89b7-410f-b10a-40e6c1b7a921\") " Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.419542 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs" (OuterVolumeSpecName: "logs") pod "8db4914f-89b7-410f-b10a-40e6c1b7a921" (UID: "8db4914f-89b7-410f-b10a-40e6c1b7a921"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.442952 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6" (OuterVolumeSpecName: "kube-api-access-dvgq6") pod "8db4914f-89b7-410f-b10a-40e6c1b7a921" (UID: "8db4914f-89b7-410f-b10a-40e6c1b7a921"). InnerVolumeSpecName "kube-api-access-dvgq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.451022 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data" (OuterVolumeSpecName: "config-data") pod "8db4914f-89b7-410f-b10a-40e6c1b7a921" (UID: "8db4914f-89b7-410f-b10a-40e6c1b7a921"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.451187 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8db4914f-89b7-410f-b10a-40e6c1b7a921" (UID: "8db4914f-89b7-410f-b10a-40e6c1b7a921"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.523334 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db4914f-89b7-410f-b10a-40e6c1b7a921-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.523665 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvgq6\" (UniqueName: \"kubernetes.io/projected/8db4914f-89b7-410f-b10a-40e6c1b7a921-kube-api-access-dvgq6\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.523679 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.523693 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4914f-89b7-410f-b10a-40e6c1b7a921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.747931 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerStarted","Data":"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.747975 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerStarted","Data":"feec9417985c2fa032eef5e30d8b80c22eeebd8cf0a4ccee41751acdf18c427c"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.749750 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c29b310f-27f8-49df-a6a8-bd2a589cd807","Type":"ContainerStarted","Data":"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.749775 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c29b310f-27f8-49df-a6a8-bd2a589cd807","Type":"ContainerStarted","Data":"12065c8373202c09bbb1e03d9c083c4a81b36634f7e709e114ad425b73bd0866"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.751341 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ac157dc7-6df6-4f4f-ba65-c85b58f78fff","Type":"ContainerStarted","Data":"46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.751406 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ac157dc7-6df6-4f4f-ba65-c85b58f78fff","Type":"ContainerStarted","Data":"a039d06cf9ab0385593ea4c8b98b9203f4e79c71af9d9a55959c54b61e5366cd"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.752480 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.787233 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.787210832 podStartE2EDuration="2.787210832s" podCreationTimestamp="2025-10-03 10:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:32.784951779 +0000 UTC m=+1254.581583646" watchObservedRunningTime="2025-10-03 10:04:32.787210832 +0000 UTC m=+1254.583842689" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.790231 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8db4914f-89b7-410f-b10a-40e6c1b7a921","Type":"ContainerDied","Data":"d7692d75d198198ff851d20eea5e7a3b8e2979f1f2ac5900f4fb69fd7a1b4403"} Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.790284 4990 scope.go:117] "RemoveContainer" containerID="2820731a6f9d16c8bcb30e99bdc2a985f9a42273b228511e6077d428aa4968fc" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.790424 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.821156 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.821136617 podStartE2EDuration="2.821136617s" podCreationTimestamp="2025-10-03 10:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:32.809023177 +0000 UTC m=+1254.605655054" watchObservedRunningTime="2025-10-03 10:04:32.821136617 +0000 UTC m=+1254.617768474" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.828283 4990 scope.go:117] "RemoveContainer" containerID="a3ff65db3e039fe244fb39ebe85806f1e5811322d248180fad53bc31a7d05cb4" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.834441 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.854015 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.864467 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:32 crc kubenswrapper[4990]: E1003 10:04:32.864939 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-api" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.864957 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-api" Oct 03 10:04:32 crc kubenswrapper[4990]: E1003 10:04:32.864986 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-log" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.864993 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-log" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.875047 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-log" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.875087 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" containerName="nova-api-api" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.883586 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.885960 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.903064 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db4914f-89b7-410f-b10a-40e6c1b7a921" path="/var/lib/kubelet/pods/8db4914f-89b7-410f-b10a-40e6c1b7a921/volumes" Oct 03 10:04:32 crc kubenswrapper[4990]: I1003 10:04:32.903779 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.035380 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 
10:04:33.035443 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.035550 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4css5\" (UniqueName: \"kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.035606 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.136896 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4css5\" (UniqueName: \"kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.137362 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.137483 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.137532 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.138261 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.141621 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.141834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.153544 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4css5\" (UniqueName: \"kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5\") pod \"nova-api-0\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.210251 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.709864 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:33 crc kubenswrapper[4990]: W1003 10:04:33.713789 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ecc011a_e3b9_4a8b_aa40_b24a91477b6b.slice/crio-cae895b620efa377ac09e304467cf8f16c91d58e7e80158d0a5be350fc17babb WatchSource:0}: Error finding container cae895b620efa377ac09e304467cf8f16c91d58e7e80158d0a5be350fc17babb: Status 404 returned error can't find the container with id cae895b620efa377ac09e304467cf8f16c91d58e7e80158d0a5be350fc17babb Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.805492 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerStarted","Data":"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd"} Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.806846 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerStarted","Data":"cae895b620efa377ac09e304467cf8f16c91d58e7e80158d0a5be350fc17babb"} Oct 03 10:04:33 crc kubenswrapper[4990]: I1003 10:04:33.834487 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.83446532 podStartE2EDuration="3.83446532s" podCreationTimestamp="2025-10-03 10:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:33.826782565 +0000 UTC m=+1255.623414422" watchObservedRunningTime="2025-10-03 10:04:33.83446532 +0000 UTC m=+1255.631097177" Oct 03 10:04:34 crc kubenswrapper[4990]: I1003 10:04:34.824293 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerStarted","Data":"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9"} Oct 03 10:04:34 crc kubenswrapper[4990]: I1003 10:04:34.824766 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerStarted","Data":"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"} Oct 03 10:04:34 crc kubenswrapper[4990]: I1003 10:04:34.850935 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.850914938 podStartE2EDuration="2.850914938s" podCreationTimestamp="2025-10-03 10:04:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:34.841971255 +0000 UTC m=+1256.638603132" watchObservedRunningTime="2025-10-03 10:04:34.850914938 +0000 UTC m=+1256.647546795" Oct 03 10:04:36 crc kubenswrapper[4990]: I1003 10:04:36.181809 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:04:36 crc kubenswrapper[4990]: I1003 10:04:36.182192 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:04:36 crc kubenswrapper[4990]: I1003 10:04:36.193578 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.181305 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.181918 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.192741 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.221815 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.323560 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 10:04:41 crc kubenswrapper[4990]: I1003 10:04:41.930944 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 10:04:42 crc kubenswrapper[4990]: I1003 10:04:42.199758 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 10:04:42 crc kubenswrapper[4990]: I1003 10:04:42.200121 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 10:04:43 crc kubenswrapper[4990]: I1003 10:04:43.211046 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 10:04:43 crc kubenswrapper[4990]: I1003 10:04:43.211379 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 10:04:44 crc kubenswrapper[4990]: I1003 10:04:44.293770 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Oct 03 10:04:44 crc kubenswrapper[4990]: I1003 10:04:44.293793 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 10:04:51 crc kubenswrapper[4990]: I1003 10:04:51.188070 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 10:04:51 crc kubenswrapper[4990]: I1003 10:04:51.188761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 10:04:51 crc kubenswrapper[4990]: I1003 10:04:51.194964 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 10:04:51 crc kubenswrapper[4990]: I1003 10:04:51.195030 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.931553 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.993233 4990 generic.go:334] "Generic (PLEG): container finished" podID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" containerID="08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182" exitCode=137 Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.993276 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.993298 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e","Type":"ContainerDied","Data":"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182"} Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.993350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e","Type":"ContainerDied","Data":"4f0e338210e994a86fff92deaf3ed83ad54e6c94bb58ef639fca6825c949df3a"} Oct 03 10:04:52 crc kubenswrapper[4990]: I1003 10:04:52.993386 4990 scope.go:117] "RemoveContainer" containerID="08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.022775 4990 scope.go:117] "RemoveContainer" containerID="08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182" Oct 03 10:04:53 crc kubenswrapper[4990]: E1003 10:04:53.023257 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182\": container with ID starting with 08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182 not found: ID does not exist" containerID="08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.023296 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182"} err="failed to get container status \"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182\": rpc error: code = NotFound desc = could not find container \"08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182\": container with ID starting with 
08b7d221cebe3a6e249706c7e47b20d35b7c3355b93cee7cbfcbde63bf042182 not found: ID does not exist" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.035118 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data\") pod \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.035288 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2n82\" (UniqueName: \"kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82\") pod \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.035414 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle\") pod \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\" (UID: \"6ce54404-9a7e-44eb-85e4-99a2ef1fb63e\") " Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.043557 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82" (OuterVolumeSpecName: "kube-api-access-w2n82") pod "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" (UID: "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e"). InnerVolumeSpecName "kube-api-access-w2n82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.064395 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.069675 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" (UID: "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.074205 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data" (OuterVolumeSpecName: "config-data") pod "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" (UID: "6ce54404-9a7e-44eb-85e4-99a2ef1fb63e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.137426 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2n82\" (UniqueName: \"kubernetes.io/projected/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-kube-api-access-w2n82\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.137464 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.137476 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.214856 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.214923 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.215394 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.215423 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.217490 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.218244 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.334542 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.344356 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.380594 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:53 crc kubenswrapper[4990]: E1003 10:04:53.381023 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.381039 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.381208 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.381809 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.394316 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.399064 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.402735 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.408757 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.450156 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.450278 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.450338 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzm55\" (UniqueName: \"kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 
crc kubenswrapper[4990]: I1003 10:04:53.450437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.450534 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.500579 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"] Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.513854 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.520902 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"] Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552584 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552674 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552712 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552737 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzm55\" (UniqueName: \"kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552758 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552794 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552822 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvx7\" (UniqueName: \"kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552903 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.552934 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.553026 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.580475 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.582251 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.582937 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.583480 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.614378 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzm55\" (UniqueName: \"kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654388 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654473 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654497 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654581 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvx7\" (UniqueName: 
\"kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654613 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.654767 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.655890 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.656497 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.657155 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config\") pod 
\"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.657803 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.658686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.689192 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvx7\" (UniqueName: \"kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7\") pod \"dnsmasq-dns-766c7d6945-qr5ht\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.700465 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:53 crc kubenswrapper[4990]: I1003 10:04:53.853955 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:54 crc kubenswrapper[4990]: I1003 10:04:54.238838 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:04:54 crc kubenswrapper[4990]: I1003 10:04:54.358791 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"] Oct 03 10:04:54 crc kubenswrapper[4990]: W1003 10:04:54.381228 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7c15976_1c83_43b6_8077_6af8ecc010dc.slice/crio-db978c5278aadddd79c7a6ab5b1e6c320d71f5d5fefbe60a9435be597490149c WatchSource:0}: Error finding container db978c5278aadddd79c7a6ab5b1e6c320d71f5d5fefbe60a9435be597490149c: Status 404 returned error can't find the container with id db978c5278aadddd79c7a6ab5b1e6c320d71f5d5fefbe60a9435be597490149c Oct 03 10:04:54 crc kubenswrapper[4990]: I1003 10:04:54.884491 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce54404-9a7e-44eb-85e4-99a2ef1fb63e" path="/var/lib/kubelet/pods/6ce54404-9a7e-44eb-85e4-99a2ef1fb63e/volumes" Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.013073 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5b89027-cf5d-4807-adc3-b4915304f1f2","Type":"ContainerStarted","Data":"5d577a29b7ee9a04a2b18df67e6f481c57ad2dbffb24a317b5c6b4aaad21f535"} Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.013127 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5b89027-cf5d-4807-adc3-b4915304f1f2","Type":"ContainerStarted","Data":"9b8f7e8d845cb735aace0c87a789a75c326074ab92ec2b02a341266267dc4d56"} Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.014732 4990 generic.go:334] "Generic (PLEG): container finished" podID="d7c15976-1c83-43b6-8077-6af8ecc010dc" 
containerID="80d90ac04b21b6c1e6e2c272077df839f5fc0b7edc4dff8dce45383e46f9d909" exitCode=0 Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.014820 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" event={"ID":"d7c15976-1c83-43b6-8077-6af8ecc010dc","Type":"ContainerDied","Data":"80d90ac04b21b6c1e6e2c272077df839f5fc0b7edc4dff8dce45383e46f9d909"} Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.014857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" event={"ID":"d7c15976-1c83-43b6-8077-6af8ecc010dc","Type":"ContainerStarted","Data":"db978c5278aadddd79c7a6ab5b1e6c320d71f5d5fefbe60a9435be597490149c"} Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.046656 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.046633328 podStartE2EDuration="2.046633328s" podCreationTimestamp="2025-10-03 10:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:55.028804935 +0000 UTC m=+1276.825436792" watchObservedRunningTime="2025-10-03 10:04:55.046633328 +0000 UTC m=+1276.843265185" Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.303899 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.303967 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.361001 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.361310 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-central-agent" containerID="cri-o://bb89622637cbc905e62915a052bd2f9da0291eefe6fa34e3aa8c51ef8d6aeba5" gracePeriod=30 Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.361359 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="proxy-httpd" containerID="cri-o://9154c966301a043ec5d171a22e0f9746800a428a672c68e3044d67234cc77137" gracePeriod=30 Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.361435 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-notification-agent" containerID="cri-o://92a1b71aeb7e67cae559de493a423aa2624c77c669ed0f26d9367eb3c7be8311" gracePeriod=30 Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.361426 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="sg-core" containerID="cri-o://996342a50d0cd52c6936d7b1cfed1c5fa7d5c3b6d7726ef5217dffc430e8c991" gracePeriod=30 Oct 03 10:04:55 crc kubenswrapper[4990]: I1003 10:04:55.909230 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.026827 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" 
event={"ID":"d7c15976-1c83-43b6-8077-6af8ecc010dc","Type":"ContainerStarted","Data":"24f083714d9fadf03161dcac11ebfaf3e1e57373e8285bfc2cfe8dda04b30139"} Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.026954 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.030157 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerID="9154c966301a043ec5d171a22e0f9746800a428a672c68e3044d67234cc77137" exitCode=0 Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.030190 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerID="996342a50d0cd52c6936d7b1cfed1c5fa7d5c3b6d7726ef5217dffc430e8c991" exitCode=2 Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.030201 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerID="bb89622637cbc905e62915a052bd2f9da0291eefe6fa34e3aa8c51ef8d6aeba5" exitCode=0 Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.030971 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerDied","Data":"9154c966301a043ec5d171a22e0f9746800a428a672c68e3044d67234cc77137"} Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.031009 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerDied","Data":"996342a50d0cd52c6936d7b1cfed1c5fa7d5c3b6d7726ef5217dffc430e8c991"} Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.031023 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerDied","Data":"bb89622637cbc905e62915a052bd2f9da0291eefe6fa34e3aa8c51ef8d6aeba5"} Oct 03 
10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.031149 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-log" containerID="cri-o://bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a" gracePeriod=30 Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.031427 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-api" containerID="cri-o://17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9" gracePeriod=30 Oct 03 10:04:56 crc kubenswrapper[4990]: I1003 10:04:56.058865 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" podStartSLOduration=3.058846033 podStartE2EDuration="3.058846033s" podCreationTimestamp="2025-10-03 10:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:04:56.049607568 +0000 UTC m=+1277.846239425" watchObservedRunningTime="2025-10-03 10:04:56.058846033 +0000 UTC m=+1277.855477890" Oct 03 10:04:57 crc kubenswrapper[4990]: I1003 10:04:57.041157 4990 generic.go:334] "Generic (PLEG): container finished" podID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerID="bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a" exitCode=143 Oct 03 10:04:57 crc kubenswrapper[4990]: I1003 10:04:57.041226 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerDied","Data":"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"} Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.069781 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff8a0fbd-0f1a-4917-8467-190e72695912" 
containerID="92a1b71aeb7e67cae559de493a423aa2624c77c669ed0f26d9367eb3c7be8311" exitCode=0 Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.069874 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerDied","Data":"92a1b71aeb7e67cae559de493a423aa2624c77c669ed0f26d9367eb3c7be8311"} Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.156957 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.243572 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.243837 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjbv\" (UniqueName: \"kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.243966 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.244088 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc 
kubenswrapper[4990]: I1003 10:04:58.244256 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.244369 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.244500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.244669 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd\") pod \"ff8a0fbd-0f1a-4917-8467-190e72695912\" (UID: \"ff8a0fbd-0f1a-4917-8467-190e72695912\") " Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.245498 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.245839 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.250427 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv" (OuterVolumeSpecName: "kube-api-access-bjjbv") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "kube-api-access-bjjbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.252690 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts" (OuterVolumeSpecName: "scripts") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.282351 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.312633 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.328704 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346871 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346908 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346920 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346930 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-ceilometer-tls-certs\") on 
node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346947 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346960 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a0fbd-0f1a-4917-8467-190e72695912-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.346969 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjjbv\" (UniqueName: \"kubernetes.io/projected/ff8a0fbd-0f1a-4917-8467-190e72695912-kube-api-access-bjjbv\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.356596 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data" (OuterVolumeSpecName: "config-data") pod "ff8a0fbd-0f1a-4917-8467-190e72695912" (UID: "ff8a0fbd-0f1a-4917-8467-190e72695912"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.448978 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a0fbd-0f1a-4917-8467-190e72695912-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:58 crc kubenswrapper[4990]: I1003 10:04:58.701760 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.081399 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff8a0fbd-0f1a-4917-8467-190e72695912","Type":"ContainerDied","Data":"861e0a5bd75bb41ca37aca1bc0116c922699b7803320543bba0bb7620d92a5a6"} Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.081462 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.081822 4990 scope.go:117] "RemoveContainer" containerID="9154c966301a043ec5d171a22e0f9746800a428a672c68e3044d67234cc77137" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.109433 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.110313 4990 scope.go:117] "RemoveContainer" containerID="996342a50d0cd52c6936d7b1cfed1c5fa7d5c3b6d7726ef5217dffc430e8c991" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.119329 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.131451 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:59 crc kubenswrapper[4990]: E1003 10:04:59.132013 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-notification-agent" Oct 03 10:04:59 crc 
kubenswrapper[4990]: I1003 10:04:59.132033 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-notification-agent" Oct 03 10:04:59 crc kubenswrapper[4990]: E1003 10:04:59.132044 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="proxy-httpd" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132052 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="proxy-httpd" Oct 03 10:04:59 crc kubenswrapper[4990]: E1003 10:04:59.132070 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="sg-core" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132075 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="sg-core" Oct 03 10:04:59 crc kubenswrapper[4990]: E1003 10:04:59.132089 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-central-agent" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132095 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-central-agent" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132265 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-central-agent" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132277 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="ceilometer-notification-agent" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132299 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="proxy-httpd" 
Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.132308 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" containerName="sg-core" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.137842 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.141407 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.141660 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.141872 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.147683 4990 scope.go:117] "RemoveContainer" containerID="92a1b71aeb7e67cae559de493a423aa2624c77c669ed0f26d9367eb3c7be8311" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164466 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164688 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4xl4\" (UniqueName: \"kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164770 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164792 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164892 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164927 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.164965 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.172942 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.183324 4990 scope.go:117] "RemoveContainer" containerID="bb89622637cbc905e62915a052bd2f9da0291eefe6fa34e3aa8c51ef8d6aeba5" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266194 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266239 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266268 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266294 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266371 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266405 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4xl4\" (UniqueName: \"kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266432 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266450 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.266866 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.267596 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.270656 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.271656 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.272112 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.272310 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.272406 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.287328 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4xl4\" (UniqueName: \"kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4\") pod \"ceilometer-0\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") 
" pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.470398 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.560679 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.570475 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data\") pod \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.570564 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs\") pod \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.570645 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4css5\" (UniqueName: \"kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5\") pod \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.570698 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle\") pod \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\" (UID: \"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b\") " Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.574042 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs" 
(OuterVolumeSpecName: "logs") pod "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" (UID: "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.597868 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5" (OuterVolumeSpecName: "kube-api-access-4css5") pod "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" (UID: "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b"). InnerVolumeSpecName "kube-api-access-4css5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.611455 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" (UID: "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.671973 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4css5\" (UniqueName: \"kubernetes.io/projected/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-kube-api-access-4css5\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.672000 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.672012 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.675190 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data" (OuterVolumeSpecName: "config-data") pod "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" (UID: "4ecc011a-e3b9-4a8b-aa40-b24a91477b6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.777460 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:04:59 crc kubenswrapper[4990]: I1003 10:04:59.984700 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.092643 4990 generic.go:334] "Generic (PLEG): container finished" podID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerID="17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9" exitCode=0 Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.092726 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.092776 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerDied","Data":"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9"} Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.092816 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ecc011a-e3b9-4a8b-aa40-b24a91477b6b","Type":"ContainerDied","Data":"cae895b620efa377ac09e304467cf8f16c91d58e7e80158d0a5be350fc17babb"} Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.092836 4990 scope.go:117] "RemoveContainer" containerID="17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9" Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.096862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerStarted","Data":"6d7649f24ac5f6250b76e795a72b31471b440e7aac2b79862868b1216833db8e"} Oct 03 10:05:00 crc kubenswrapper[4990]: 
I1003 10:05:00.119978 4990 scope.go:117] "RemoveContainer" containerID="bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.145097 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.155234 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.162050 4990 scope.go:117] "RemoveContainer" containerID="17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9"
Oct 03 10:05:00 crc kubenswrapper[4990]: E1003 10:05:00.162721 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9\": container with ID starting with 17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9 not found: ID does not exist" containerID="17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.162758 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9"} err="failed to get container status \"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9\": rpc error: code = NotFound desc = could not find container \"17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9\": container with ID starting with 17c19850e73394918ec6b64c523916b6cf2955cb12bb416ddcf0d9738dc842d9 not found: ID does not exist"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.162785 4990 scope.go:117] "RemoveContainer" containerID="bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"
Oct 03 10:05:00 crc kubenswrapper[4990]: E1003 10:05:00.163102 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a\": container with ID starting with bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a not found: ID does not exist" containerID="bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.163132 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a"} err="failed to get container status \"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a\": rpc error: code = NotFound desc = could not find container \"bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a\": container with ID starting with bf7e8eb7d01e779e81cee1f93ec78e64dd203bc3db8ea237f8495bfa110a665a not found: ID does not exist"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.167321 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:00 crc kubenswrapper[4990]: E1003 10:05:00.167804 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-api"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.167825 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-api"
Oct 03 10:05:00 crc kubenswrapper[4990]: E1003 10:05:00.167866 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-log"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.167875 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-log"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.168095 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-api"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.168124 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" containerName="nova-api-log"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.169310 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.172486 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.172693 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.172829 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.178263 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.208720 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.208764 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.208823 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.208849 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mclcx\" (UniqueName: \"kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.208987 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.209107 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311130 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311536 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311563 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311621 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311646 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mclcx\" (UniqueName: \"kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.311692 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.312277 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.320377 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.326153 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.326420 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.327126 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.330853 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mclcx\" (UniqueName: \"kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx\") pod \"nova-api-0\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.519457 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.884968 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ecc011a-e3b9-4a8b-aa40-b24a91477b6b" path="/var/lib/kubelet/pods/4ecc011a-e3b9-4a8b-aa40-b24a91477b6b/volumes"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.886069 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8a0fbd-0f1a-4917-8467-190e72695912" path="/var/lib/kubelet/pods/ff8a0fbd-0f1a-4917-8467-190e72695912/volumes"
Oct 03 10:05:00 crc kubenswrapper[4990]: I1003 10:05:00.981575 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:01 crc kubenswrapper[4990]: I1003 10:05:01.120658 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerStarted","Data":"0cf5ac29746ce882d6ae1c7168250fbf34eb77ced233198b98d8e39f0ab37bd4"}
Oct 03 10:05:01 crc kubenswrapper[4990]: I1003 10:05:01.122461 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerStarted","Data":"10bbe2a48dee7a9c6851c10f419a6f513a6e5d9a57f03edd2f2cddcdfa46acc7"}
Oct 03 10:05:02 crc kubenswrapper[4990]: I1003 10:05:02.132578 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerStarted","Data":"f726157640cde355a6b4fde9ac87cd11f712f5e45c77f82242ffce8ed67bd078"}
Oct 03 10:05:02 crc kubenswrapper[4990]: I1003 10:05:02.134178 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerStarted","Data":"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c"}
Oct 03 10:05:02 crc kubenswrapper[4990]: I1003 10:05:02.134204 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerStarted","Data":"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3"}
Oct 03 10:05:02 crc kubenswrapper[4990]: I1003 10:05:02.155659 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.155642101 podStartE2EDuration="2.155642101s" podCreationTimestamp="2025-10-03 10:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:05:02.154907652 +0000 UTC m=+1283.951539509" watchObservedRunningTime="2025-10-03 10:05:02.155642101 +0000 UTC m=+1283.952273958"
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.144887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerStarted","Data":"77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d"}
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.701832 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.725322 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.855755 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht"
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.921013 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"]
Oct 03 10:05:03 crc kubenswrapper[4990]: I1003 10:05:03.921366 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="dnsmasq-dns" containerID="cri-o://24992578c77dff6b57091b7e6c8588c10de059bad8e64877ece6521dbc3ef205" gracePeriod=10
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.160423 4990 generic.go:334] "Generic (PLEG): container finished" podID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerID="24992578c77dff6b57091b7e6c8588c10de059bad8e64877ece6521dbc3ef205" exitCode=0
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.160545 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" event={"ID":"c2ec452a-525c-458f-b876-dbef8ff507f4","Type":"ContainerDied","Data":"24992578c77dff6b57091b7e6c8588c10de059bad8e64877ece6521dbc3ef205"}
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.179960 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.363372 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gwhsf"]
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.365427 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.368065 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.368242 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.389076 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwhsf"]
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.530830 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.536555 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.536653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.536736 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.536802 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7rh\" (UniqueName: \"kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638278 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638589 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638645 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638683 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638783 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.638903 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48sx\" (UniqueName: \"kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx\") pod \"c2ec452a-525c-458f-b876-dbef8ff507f4\" (UID: \"c2ec452a-525c-458f-b876-dbef8ff507f4\") "
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.639141 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7rh\" (UniqueName: \"kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.639232 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.639279 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.639319 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.648681 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx" (OuterVolumeSpecName: "kube-api-access-s48sx") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "kube-api-access-s48sx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.649810 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.654169 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.660439 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.671655 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7rh\" (UniqueName: \"kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh\") pod \"nova-cell1-cell-mapping-gwhsf\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") " pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.697014 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.697566 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.698287 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config" (OuterVolumeSpecName: "config") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.704111 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.734050 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.743014 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-config\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.743053 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.743065 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.743075 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s48sx\" (UniqueName: \"kubernetes.io/projected/c2ec452a-525c-458f-b876-dbef8ff507f4-kube-api-access-s48sx\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.743084 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.744074 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2ec452a-525c-458f-b876-dbef8ff507f4" (UID: "c2ec452a-525c-458f-b876-dbef8ff507f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:05:04 crc kubenswrapper[4990]: I1003 10:05:04.845358 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2ec452a-525c-458f-b876-dbef8ff507f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.174451 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerStarted","Data":"9e36ab3093e7df92e341c1eb14c639feff278ab0a3155acc7a3df5e7970a5bb6"}
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.175774 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.186935 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" event={"ID":"c2ec452a-525c-458f-b876-dbef8ff507f4","Type":"ContainerDied","Data":"a1c08ab9ae75d4710dfcfb5af700b63cd3ecfea691eaa6c3ba960531194bf1ad"}
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.187244 4990 scope.go:117] "RemoveContainer" containerID="24992578c77dff6b57091b7e6c8588c10de059bad8e64877ece6521dbc3ef205"
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.186988 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4"
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.195185 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6345858949999998 podStartE2EDuration="6.195164709s" podCreationTimestamp="2025-10-03 10:04:59 +0000 UTC" firstStartedPulling="2025-10-03 10:05:00.003679008 +0000 UTC m=+1281.800310865" lastFinishedPulling="2025-10-03 10:05:04.564257822 +0000 UTC m=+1286.360889679" observedRunningTime="2025-10-03 10:05:05.193023664 +0000 UTC m=+1286.989655531" watchObservedRunningTime="2025-10-03 10:05:05.195164709 +0000 UTC m=+1286.991796566"
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.228758 4990 scope.go:117] "RemoveContainer" containerID="9cdcd500d5975a53eb34c68565bee9aecf2028ebc08b6a9bd8bf5b593c18a194"
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.238796 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwhsf"]
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.246871 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"]
Oct 03 10:05:05 crc kubenswrapper[4990]: I1003 10:05:05.255151 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5577d7975c-6nzt4"]
Oct 03 10:05:06 crc kubenswrapper[4990]: I1003 10:05:06.201770 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwhsf" event={"ID":"3bba0b30-30c2-4170-9455-5c1e16be0844","Type":"ContainerStarted","Data":"dcbd5d737932527d27f958573242ee91326051d3e9490ecdb70837f5f700c9d3"}
Oct 03 10:05:06 crc kubenswrapper[4990]: I1003 10:05:06.202168 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwhsf" event={"ID":"3bba0b30-30c2-4170-9455-5c1e16be0844","Type":"ContainerStarted","Data":"e400a222c9d229c06f13e5063aa9e6facccf23d8b48714de6ca34d9e23762779"}
Oct 03 10:05:06 crc kubenswrapper[4990]: I1003 10:05:06.221235 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gwhsf" podStartSLOduration=2.221212985 podStartE2EDuration="2.221212985s" podCreationTimestamp="2025-10-03 10:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:05:06.218729942 +0000 UTC m=+1288.015361819" watchObservedRunningTime="2025-10-03 10:05:06.221212985 +0000 UTC m=+1288.017844842"
Oct 03 10:05:06 crc kubenswrapper[4990]: I1003 10:05:06.882782 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" path="/var/lib/kubelet/pods/c2ec452a-525c-458f-b876-dbef8ff507f4/volumes"
Oct 03 10:05:09 crc kubenswrapper[4990]: I1003 10:05:09.403413 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5577d7975c-6nzt4" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: i/o timeout"
Oct 03 10:05:10 crc kubenswrapper[4990]: I1003 10:05:10.520682 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 10:05:10 crc kubenswrapper[4990]: I1003 10:05:10.521073 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 10:05:11 crc kubenswrapper[4990]: I1003 10:05:11.254170 4990 generic.go:334] "Generic (PLEG): container finished" podID="3bba0b30-30c2-4170-9455-5c1e16be0844" containerID="dcbd5d737932527d27f958573242ee91326051d3e9490ecdb70837f5f700c9d3" exitCode=0
Oct 03 10:05:11 crc kubenswrapper[4990]: I1003 10:05:11.254211 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwhsf" event={"ID":"3bba0b30-30c2-4170-9455-5c1e16be0844","Type":"ContainerDied","Data":"dcbd5d737932527d27f958573242ee91326051d3e9490ecdb70837f5f700c9d3"}
Oct 03 10:05:11 crc kubenswrapper[4990]: I1003 10:05:11.535697 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:11 crc kubenswrapper[4990]: I1003 10:05:11.535738 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.643594 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.811858 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle\") pod \"3bba0b30-30c2-4170-9455-5c1e16be0844\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") "
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.811949 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts\") pod \"3bba0b30-30c2-4170-9455-5c1e16be0844\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") "
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.811982 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7rh\" (UniqueName: \"kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh\") pod \"3bba0b30-30c2-4170-9455-5c1e16be0844\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") "
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.812045 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data\") pod \"3bba0b30-30c2-4170-9455-5c1e16be0844\" (UID: \"3bba0b30-30c2-4170-9455-5c1e16be0844\") "
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.819974 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts" (OuterVolumeSpecName: "scripts") pod "3bba0b30-30c2-4170-9455-5c1e16be0844" (UID: "3bba0b30-30c2-4170-9455-5c1e16be0844"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.821564 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh" (OuterVolumeSpecName: "kube-api-access-lq7rh") pod "3bba0b30-30c2-4170-9455-5c1e16be0844" (UID: "3bba0b30-30c2-4170-9455-5c1e16be0844"). InnerVolumeSpecName "kube-api-access-lq7rh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.840893 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bba0b30-30c2-4170-9455-5c1e16be0844" (UID: "3bba0b30-30c2-4170-9455-5c1e16be0844"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.841970 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data" (OuterVolumeSpecName: "config-data") pod "3bba0b30-30c2-4170-9455-5c1e16be0844" (UID: "3bba0b30-30c2-4170-9455-5c1e16be0844"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.914015 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.914062 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.914078 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7rh\" (UniqueName: \"kubernetes.io/projected/3bba0b30-30c2-4170-9455-5c1e16be0844-kube-api-access-lq7rh\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:12 crc kubenswrapper[4990]: I1003 10:05:12.914092 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bba0b30-30c2-4170-9455-5c1e16be0844-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.274152 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gwhsf" event={"ID":"3bba0b30-30c2-4170-9455-5c1e16be0844","Type":"ContainerDied","Data":"e400a222c9d229c06f13e5063aa9e6facccf23d8b48714de6ca34d9e23762779"}
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.274191 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e400a222c9d229c06f13e5063aa9e6facccf23d8b48714de6ca34d9e23762779"
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.274226 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gwhsf"
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.432769 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.433002 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c29b310f-27f8-49df-a6a8-bd2a589cd807" containerName="nova-scheduler-scheduler" containerID="cri-o://1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e" gracePeriod=30
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.444368 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.444603 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-log" containerID="cri-o://9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3" gracePeriod=30
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.444671 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-api" containerID="cri-o://65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c" gracePeriod=30
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.506340 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.506622 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a"
containerName="nova-metadata-log" containerID="cri-o://4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09" gracePeriod=30 Oct 03 10:05:13 crc kubenswrapper[4990]: I1003 10:05:13.506742 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" containerID="cri-o://7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd" gracePeriod=30 Oct 03 10:05:14 crc kubenswrapper[4990]: I1003 10:05:14.287403 4990 generic.go:334] "Generic (PLEG): container finished" podID="8933274f-a924-476d-805b-1a4764530a52" containerID="9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3" exitCode=143 Oct 03 10:05:14 crc kubenswrapper[4990]: I1003 10:05:14.287552 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerDied","Data":"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3"} Oct 03 10:05:14 crc kubenswrapper[4990]: I1003 10:05:14.290456 4990 generic.go:334] "Generic (PLEG): container finished" podID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerID="4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09" exitCode=143 Oct 03 10:05:14 crc kubenswrapper[4990]: I1003 10:05:14.290522 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerDied","Data":"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09"} Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.703803 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.866938 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle\") pod \"c29b310f-27f8-49df-a6a8-bd2a589cd807\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.867141 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data\") pod \"c29b310f-27f8-49df-a6a8-bd2a589cd807\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.867215 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzqr9\" (UniqueName: \"kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9\") pod \"c29b310f-27f8-49df-a6a8-bd2a589cd807\" (UID: \"c29b310f-27f8-49df-a6a8-bd2a589cd807\") " Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.873355 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9" (OuterVolumeSpecName: "kube-api-access-rzqr9") pod "c29b310f-27f8-49df-a6a8-bd2a589cd807" (UID: "c29b310f-27f8-49df-a6a8-bd2a589cd807"). InnerVolumeSpecName "kube-api-access-rzqr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.895577 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c29b310f-27f8-49df-a6a8-bd2a589cd807" (UID: "c29b310f-27f8-49df-a6a8-bd2a589cd807"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.900122 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data" (OuterVolumeSpecName: "config-data") pod "c29b310f-27f8-49df-a6a8-bd2a589cd807" (UID: "c29b310f-27f8-49df-a6a8-bd2a589cd807"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.969385 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.969429 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzqr9\" (UniqueName: \"kubernetes.io/projected/c29b310f-27f8-49df-a6a8-bd2a589cd807-kube-api-access-rzqr9\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:15 crc kubenswrapper[4990]: I1003 10:05:15.969447 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29b310f-27f8-49df-a6a8-bd2a589cd807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.310314 4990 generic.go:334] "Generic (PLEG): container finished" podID="c29b310f-27f8-49df-a6a8-bd2a589cd807" containerID="1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e" exitCode=0 Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.310476 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.310465 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c29b310f-27f8-49df-a6a8-bd2a589cd807","Type":"ContainerDied","Data":"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e"} Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.311226 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c29b310f-27f8-49df-a6a8-bd2a589cd807","Type":"ContainerDied","Data":"12065c8373202c09bbb1e03d9c083c4a81b36634f7e709e114ad425b73bd0866"} Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.311253 4990 scope.go:117] "RemoveContainer" containerID="1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.344821 4990 scope.go:117] "RemoveContainer" containerID="1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.345450 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:16 crc kubenswrapper[4990]: E1003 10:05:16.346654 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e\": container with ID starting with 1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e not found: ID does not exist" containerID="1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.346691 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e"} err="failed to get container status \"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e\": rpc error: code = NotFound 
desc = could not find container \"1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e\": container with ID starting with 1229b4c30a6344fa4aa2d296bcd4f26419595ffdd86356892af6b01796699d7e not found: ID does not exist" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.368548 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381019 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:16 crc kubenswrapper[4990]: E1003 10:05:16.381468 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bba0b30-30c2-4170-9455-5c1e16be0844" containerName="nova-manage" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381491 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bba0b30-30c2-4170-9455-5c1e16be0844" containerName="nova-manage" Oct 03 10:05:16 crc kubenswrapper[4990]: E1003 10:05:16.381532 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="dnsmasq-dns" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381540 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="dnsmasq-dns" Oct 03 10:05:16 crc kubenswrapper[4990]: E1003 10:05:16.381553 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="init" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381560 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="init" Oct 03 10:05:16 crc kubenswrapper[4990]: E1003 10:05:16.381595 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29b310f-27f8-49df-a6a8-bd2a589cd807" containerName="nova-scheduler-scheduler" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381603 4990 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c29b310f-27f8-49df-a6a8-bd2a589cd807" containerName="nova-scheduler-scheduler" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381901 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ec452a-525c-458f-b876-dbef8ff507f4" containerName="dnsmasq-dns" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381934 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29b310f-27f8-49df-a6a8-bd2a589cd807" containerName="nova-scheduler-scheduler" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.381953 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bba0b30-30c2-4170-9455-5c1e16be0844" containerName="nova-manage" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.382639 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.385859 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.391706 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.482250 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd97w\" (UniqueName: \"kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.482353 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc 
kubenswrapper[4990]: I1003 10:05:16.482409 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.584619 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd97w\" (UniqueName: \"kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.585177 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.586558 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.590603 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.592932 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.602252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd97w\" (UniqueName: \"kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w\") pod \"nova-scheduler-0\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.651409 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:47394->10.217.0.194:8775: read: connection reset by peer" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.651467 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:47402->10.217.0.194:8775: read: connection reset by peer" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.707622 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.884631 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29b310f-27f8-49df-a6a8-bd2a589cd807" path="/var/lib/kubelet/pods/c29b310f-27f8-49df-a6a8-bd2a589cd807/volumes" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.915228 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992495 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mclcx\" (UniqueName: \"kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992593 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992648 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992676 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992856 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.992882 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle\") pod \"8933274f-a924-476d-805b-1a4764530a52\" (UID: \"8933274f-a924-476d-805b-1a4764530a52\") " Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.994974 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs" (OuterVolumeSpecName: "logs") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:16 crc kubenswrapper[4990]: I1003 10:05:16.998587 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx" (OuterVolumeSpecName: "kube-api-access-mclcx") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "kube-api-access-mclcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.026418 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data" (OuterVolumeSpecName: "config-data") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.035790 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.047800 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.053674 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8933274f-a924-476d-805b-1a4764530a52" (UID: "8933274f-a924-476d-805b-1a4764530a52"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.062242 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096035 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096313 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8933274f-a924-476d-805b-1a4764530a52-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096327 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096339 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096351 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8933274f-a924-476d-805b-1a4764530a52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.096365 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mclcx\" (UniqueName: \"kubernetes.io/projected/8933274f-a924-476d-805b-1a4764530a52-kube-api-access-mclcx\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198083 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs\") pod \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " Oct 
03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198164 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle\") pod \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198260 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data\") pod \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbwpj\" (UniqueName: \"kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj\") pod \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198330 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs\") pod \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\" (UID: \"d34b5bf1-bd99-4b36-b73f-3b0cc470315a\") " Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198494 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs" (OuterVolumeSpecName: "logs") pod "d34b5bf1-bd99-4b36-b73f-3b0cc470315a" (UID: "d34b5bf1-bd99-4b36-b73f-3b0cc470315a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.198710 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.201412 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj" (OuterVolumeSpecName: "kube-api-access-kbwpj") pod "d34b5bf1-bd99-4b36-b73f-3b0cc470315a" (UID: "d34b5bf1-bd99-4b36-b73f-3b0cc470315a"). InnerVolumeSpecName "kube-api-access-kbwpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.229073 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34b5bf1-bd99-4b36-b73f-3b0cc470315a" (UID: "d34b5bf1-bd99-4b36-b73f-3b0cc470315a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.252954 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.262876 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data" (OuterVolumeSpecName: "config-data") pod "d34b5bf1-bd99-4b36-b73f-3b0cc470315a" (UID: "d34b5bf1-bd99-4b36-b73f-3b0cc470315a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.295178 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d34b5bf1-bd99-4b36-b73f-3b0cc470315a" (UID: "d34b5bf1-bd99-4b36-b73f-3b0cc470315a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.300387 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.300426 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbwpj\" (UniqueName: \"kubernetes.io/projected/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-kube-api-access-kbwpj\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.300442 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.300461 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34b5bf1-bd99-4b36-b73f-3b0cc470315a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.347674 4990 generic.go:334] "Generic (PLEG): container finished" podID="8933274f-a924-476d-805b-1a4764530a52" containerID="65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c" exitCode=0 Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.347887 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.347948 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerDied","Data":"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c"} Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.348020 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8933274f-a924-476d-805b-1a4764530a52","Type":"ContainerDied","Data":"10bbe2a48dee7a9c6851c10f419a6f513a6e5d9a57f03edd2f2cddcdfa46acc7"} Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.348057 4990 scope.go:117] "RemoveContainer" containerID="65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.353859 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90","Type":"ContainerStarted","Data":"066fbe0f8a16662f328dda1a940e255130607cbaf18ba4213d35893e4bfd8ccf"} Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.356904 4990 generic.go:334] "Generic (PLEG): container finished" podID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerID="7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd" exitCode=0 Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.356943 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerDied","Data":"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd"} Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.357216 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d34b5bf1-bd99-4b36-b73f-3b0cc470315a","Type":"ContainerDied","Data":"feec9417985c2fa032eef5e30d8b80c22eeebd8cf0a4ccee41751acdf18c427c"} Oct 03 10:05:17 crc 
kubenswrapper[4990]: I1003 10:05:17.357012 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.379070 4990 scope.go:117] "RemoveContainer" containerID="9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.390112 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.417533 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.432082 4990 scope.go:117] "RemoveContainer" containerID="65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.433913 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c\": container with ID starting with 65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c not found: ID does not exist" containerID="65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.433956 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c"} err="failed to get container status \"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c\": rpc error: code = NotFound desc = could not find container \"65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c\": container with ID starting with 65ca245066b8905a6a06316c2b1a84930df72a07b47cafcf9e84626b84fc103c not found: ID does not exist" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.433986 4990 scope.go:117] "RemoveContainer" 
containerID="9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.434864 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3\": container with ID starting with 9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3 not found: ID does not exist" containerID="9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.434890 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3"} err="failed to get container status \"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3\": rpc error: code = NotFound desc = could not find container \"9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3\": container with ID starting with 9b03896e70cab0936fd6c80c7c6440079308bc8ecb4cc780af712e2cfe9478c3 not found: ID does not exist" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.434906 4990 scope.go:117] "RemoveContainer" containerID="7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.450793 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.451535 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-api" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451555 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-api" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.451580 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-log" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451588 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-log" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.451600 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-log" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451606 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-log" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.451633 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451639 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451826 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-metadata" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451847 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-api" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451865 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8933274f-a924-476d-805b-1a4764530a52" containerName="nova-api-log" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.451877 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" containerName="nova-metadata-log" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.453126 4990 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.454829 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.456384 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.456449 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.459873 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.472891 4990 scope.go:117] "RemoveContainer" containerID="4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.484349 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.492444 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.498578 4990 scope.go:117] "RemoveContainer" containerID="7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.499002 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd\": container with ID starting with 7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd not found: ID does not exist" containerID="7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.499044 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd"} err="failed to get container status \"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd\": rpc error: code = NotFound desc = could not find container \"7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd\": container with ID starting with 7e44023bb09bc56c5a426cc890a98697a9191b96787f0ad96aed2cdc5021e9dd not found: ID does not exist" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.499078 4990 scope.go:117] "RemoveContainer" containerID="4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09" Oct 03 10:05:17 crc kubenswrapper[4990]: E1003 10:05:17.499421 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09\": container with ID starting with 4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09 not found: ID does not exist" containerID="4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.499451 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09"} err="failed to get container status \"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09\": rpc error: code = NotFound desc = could not find container \"4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09\": container with ID starting with 4440e240a820fe495394b77548e8895b349535c7ba1f01dca4612b461d1e5d09 not found: ID does not exist" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.501349 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.504499 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.507592 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.507955 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.512651 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.607860 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.607904 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.607935 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.607968 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gq4\" (UniqueName: \"kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4\") pod \"nova-api-0\" (UID: 
\"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.607990 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.608060 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.608090 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.608120 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.608138 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 
10:05:17.608224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvm5b\" (UniqueName: \"kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.608276 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.709971 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.711643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712195 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712323 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4gq4\" (UniqueName: 
\"kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712440 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712620 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712798 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.712886 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.713000 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc 
kubenswrapper[4990]: I1003 10:05:17.713178 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvm5b\" (UniqueName: \"kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.713297 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.713331 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.713747 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.715204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.716832 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.717064 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.717073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.717195 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.717388 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.719346 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.732070 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4gq4\" 
(UniqueName: \"kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4\") pod \"nova-api-0\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") " pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.732611 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvm5b\" (UniqueName: \"kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b\") pod \"nova-metadata-0\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " pod="openstack/nova-metadata-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.782737 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:05:17 crc kubenswrapper[4990]: I1003 10:05:17.819569 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.251050 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:18 crc kubenswrapper[4990]: W1003 10:05:18.254877 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf251a942_6e8b_4f2e_a6e8_b505e4921b19.slice/crio-e041a5113858480ecf9548c85d70cb54ae3feab2b955a40eda4bb8b2fae3153f WatchSource:0}: Error finding container e041a5113858480ecf9548c85d70cb54ae3feab2b955a40eda4bb8b2fae3153f: Status 404 returned error can't find the container with id e041a5113858480ecf9548c85d70cb54ae3feab2b955a40eda4bb8b2fae3153f Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.346030 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:18 crc kubenswrapper[4990]: W1003 10:05:18.363059 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbafd4ca6_6b2d_4c8e_b285_e5b29d2f4507.slice/crio-5a13fac76cfd5dcbf371253b3e501d4b6fe86385968822d4cfdeee3500f015a1 WatchSource:0}: Error finding container 5a13fac76cfd5dcbf371253b3e501d4b6fe86385968822d4cfdeee3500f015a1: Status 404 returned error can't find the container with id 5a13fac76cfd5dcbf371253b3e501d4b6fe86385968822d4cfdeee3500f015a1 Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.373490 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerStarted","Data":"e041a5113858480ecf9548c85d70cb54ae3feab2b955a40eda4bb8b2fae3153f"} Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.375537 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90","Type":"ContainerStarted","Data":"6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8"} Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.390660 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.390582405 podStartE2EDuration="2.390582405s" podCreationTimestamp="2025-10-03 10:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:05:18.387141617 +0000 UTC m=+1300.183773484" watchObservedRunningTime="2025-10-03 10:05:18.390582405 +0000 UTC m=+1300.187214262" Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.886067 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8933274f-a924-476d-805b-1a4764530a52" path="/var/lib/kubelet/pods/8933274f-a924-476d-805b-1a4764530a52/volumes" Oct 03 10:05:18 crc kubenswrapper[4990]: I1003 10:05:18.887331 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34b5bf1-bd99-4b36-b73f-3b0cc470315a" 
path="/var/lib/kubelet/pods/d34b5bf1-bd99-4b36-b73f-3b0cc470315a/volumes" Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.387196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerStarted","Data":"e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0"} Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.387245 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerStarted","Data":"5c91cb2d9f2d0bdc8ec0998044ab7005f99f9cf45e79613caddeaf65988ab63e"} Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.387255 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerStarted","Data":"5a13fac76cfd5dcbf371253b3e501d4b6fe86385968822d4cfdeee3500f015a1"} Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.390852 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerStarted","Data":"f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813"} Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.390887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerStarted","Data":"209b540fbfc89585fd9ea919729898d162e3563717dbbf86e50b0e3bf191be0e"} Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.412877 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.412856065 podStartE2EDuration="2.412856065s" podCreationTimestamp="2025-10-03 10:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
10:05:19.403745953 +0000 UTC m=+1301.200377830" watchObservedRunningTime="2025-10-03 10:05:19.412856065 +0000 UTC m=+1301.209487922" Oct 03 10:05:19 crc kubenswrapper[4990]: I1003 10:05:19.438200 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.438181189 podStartE2EDuration="2.438181189s" podCreationTimestamp="2025-10-03 10:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:05:19.427754614 +0000 UTC m=+1301.224386471" watchObservedRunningTime="2025-10-03 10:05:19.438181189 +0000 UTC m=+1301.234813036" Oct 03 10:05:21 crc kubenswrapper[4990]: I1003 10:05:21.707920 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 10:05:22 crc kubenswrapper[4990]: I1003 10:05:22.820208 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:05:22 crc kubenswrapper[4990]: I1003 10:05:22.820300 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 10:05:25 crc kubenswrapper[4990]: I1003 10:05:25.303833 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:05:25 crc kubenswrapper[4990]: I1003 10:05:25.304187 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:05:26 crc kubenswrapper[4990]: I1003 10:05:26.710738 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 03 10:05:26 crc kubenswrapper[4990]: I1003 10:05:26.738379 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 03 10:05:27 crc kubenswrapper[4990]: I1003 10:05:27.498882 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 03 10:05:27 crc kubenswrapper[4990]: I1003 10:05:27.784060 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 10:05:27 crc kubenswrapper[4990]: I1003 10:05:27.784187 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 03 10:05:27 crc kubenswrapper[4990]: I1003 10:05:27.820293 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 10:05:27 crc kubenswrapper[4990]: I1003 10:05:27.820373 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 03 10:05:28 crc kubenswrapper[4990]: I1003 10:05:28.796657 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:28 crc kubenswrapper[4990]: I1003 10:05:28.796671 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:28 crc kubenswrapper[4990]: I1003 10:05:28.832721 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:28 crc kubenswrapper[4990]: I1003 10:05:28.833042 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 03 10:05:29 crc kubenswrapper[4990]: I1003 10:05:29.480270 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.790888 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.791341 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.791751 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.792084 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.796568 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.798484 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.833018 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.833288 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.841818 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 10:05:37 crc kubenswrapper[4990]: I1003 10:05:37.848018 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.144396 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.146289 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement20bd-account-delete-4jvmv"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.165380 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.165681 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a6db8725-5245-49dc-8bbc-9b8741622c42" containerName="openstackclient" containerID="cri-o://61366d13516cfa8aeb4e891ab816effe7f676485e8ed9d36f65358ce050b6251" gracePeriod=2
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.183505 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.208997 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.251236 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.251909 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="openstack-network-exporter" containerID="cri-o://78d0f733b310c1f6130bc24b6979ef796f0479532e62ab932da054c013922271" gracePeriod=300
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.288548 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.306010 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.306062 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.306114 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.306853 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.306912 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7" gracePeriod=600
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.347352 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhpt\" (UniqueName: \"kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt\") pod \"placement20bd-account-delete-4jvmv\" (UID: \"72c4e608-45c9-447a-802d-dc405aac76e4\") " pod="openstack/placement20bd-account-delete-4jvmv"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.380249 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"]
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.380876 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6db8725-5245-49dc-8bbc-9b8741622c42" containerName="openstackclient"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.380891 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6db8725-5245-49dc-8bbc-9b8741622c42" containerName="openstackclient"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.381114 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6db8725-5245-49dc-8bbc-9b8741622c42" containerName="openstackclient"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.381958 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance5d27-account-delete-qb6r6"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.403816 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="ovsdbserver-sb" containerID="cri-o://2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" gracePeriod=300
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.411180 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.455357 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhpt\" (UniqueName: \"kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt\") pod \"placement20bd-account-delete-4jvmv\" (UID: \"72c4e608-45c9-447a-802d-dc405aac76e4\") " pod="openstack/placement20bd-account-delete-4jvmv"
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.460685 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.460749 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data podName:f6624a04-5ca4-4651-a91e-0a67f97c51b5 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:55.96073302 +0000 UTC m=+1337.757364877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5") : configmap "rabbitmq-cell1-config-data" not found
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.466536 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.473012 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderdff5-account-delete-qtjtl"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.511545 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhpt\" (UniqueName: \"kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt\") pod \"placement20bd-account-delete-4jvmv\" (UID: \"72c4e608-45c9-447a-802d-dc405aac76e4\") " pod="openstack/placement20bd-account-delete-4jvmv"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.525935 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ckmxp"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.538041 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ckmxp"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.558473 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vz76\" (UniqueName: \"kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76\") pod \"cinderdff5-account-delete-qtjtl\" (UID: \"825c3741-d390-4a7c-b3a6-50e268fbe712\") " pod="openstack/cinderdff5-account-delete-qtjtl"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.558693 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktcj\" (UniqueName: \"kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj\") pod \"glance5d27-account-delete-qb6r6\" (UID: \"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c\") " pod="openstack/glance5d27-account-delete-qb6r6"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.633732 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.650370 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4h7gg"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.661992 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4h7gg"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.663102 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vktcj\" (UniqueName: \"kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj\") pod \"glance5d27-account-delete-qb6r6\" (UID: \"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c\") " pod="openstack/glance5d27-account-delete-qb6r6"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.663237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vz76\" (UniqueName: \"kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76\") pod \"cinderdff5-account-delete-qtjtl\" (UID: \"825c3741-d390-4a7c-b3a6-50e268fbe712\") " pod="openstack/cinderdff5-account-delete-qtjtl"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.682669 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.688360 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vz76\" (UniqueName: \"kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76\") pod \"cinderdff5-account-delete-qtjtl\" (UID: \"825c3741-d390-4a7c-b3a6-50e268fbe712\") " pod="openstack/cinderdff5-account-delete-qtjtl"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.696061 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.696723 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktcj\" (UniqueName: \"kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj\") pod \"glance5d27-account-delete-qb6r6\" (UID: \"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c\") " pod="openstack/glance5d27-account-delete-qb6r6"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.697666 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b28-account-delete-bg9sn"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.709482 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.758577 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6wbg7"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.765858 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6wbg7"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.766684 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlsr\" (UniqueName: \"kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr\") pod \"neutron7b28-account-delete-bg9sn\" (UID: \"c64d7262-ab55-4d88-bb9c-02825e07721a\") " pod="openstack/neutron7b28-account-delete-bg9sn"
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.769712 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.769763 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data podName:51461d28-e850-4ba3-8f27-0252b51903f1 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:56.269747759 +0000 UTC m=+1338.066379616 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data") pod "rabbitmq-server-0" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1") : configmap "rabbitmq-config-data" not found
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.772486 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement20bd-account-delete-4jvmv"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.787347 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jwpr6"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.791894 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e/ovsdbserver-sb/0.log"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.792166 4990 generic.go:334] "Generic (PLEG): container finished" podID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerID="78d0f733b310c1f6130bc24b6979ef796f0479532e62ab932da054c013922271" exitCode=2
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.792245 4990 generic.go:334] "Generic (PLEG): container finished" podID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerID="2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" exitCode=143
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.792649 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerDied","Data":"78d0f733b310c1f6130bc24b6979ef796f0479532e62ab932da054c013922271"}
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.792707 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerDied","Data":"2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf"}
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.795070 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jwpr6"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.808935 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7" exitCode=0
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.808985 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7"}
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.809020 4990 scope.go:117] "RemoveContainer" containerID="175c8521bcb98f1fac547c4c077e9a09006d6ca72a50d133b713f7d6d049ebb8"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.823473 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance5d27-account-delete-qb6r6"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.824013 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinderdff5-account-delete-qtjtl"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.874800 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlsr\" (UniqueName: \"kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr\") pod \"neutron7b28-account-delete-bg9sn\" (UID: \"c64d7262-ab55-4d88-bb9c-02825e07721a\") " pod="openstack/neutron7b28-account-delete-bg9sn"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.905937 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlsr\" (UniqueName: \"kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr\") pod \"neutron7b28-account-delete-bg9sn\" (UID: \"c64d7262-ab55-4d88-bb9c-02825e07721a\") " pod="openstack/neutron7b28-account-delete-bg9sn"
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.941504 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-68zd7"]
Oct 03 10:05:55 crc kubenswrapper[4990]: I1003 10:05:55.941728 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-68zd7" podUID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" containerName="openstack-network-exporter" containerID="cri-o://2f089b7e26c6c70346708ab3abe1a9903c0d6d3655a4a9350c20b8a22252b418" gracePeriod=30
Oct 03 10:05:55 crc kubenswrapper[4990]: E1003 10:05:55.943491 4990 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openstack/ovndbcluster-nb-etc-ovn-ovsdbserver-nb-0: PVC is being deleted" pod="openstack/ovsdbserver-nb-0" volumeName="ovndbcluster-nb-etc-ovn"
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.006236 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.006297 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:56.506281876 +0000 UTC m=+1338.302913733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.010903 4990 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.010959 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:56.510936934 +0000 UTC m=+1338.307568791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.011701 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.011742 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data podName:f6624a04-5ca4-4651-a91e-0a67f97c51b5 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.011733354 +0000 UTC m=+1338.808365211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5") : configmap "rabbitmq-cell1-config-data" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.012841 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.012887 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:56.512878513 +0000 UTC m=+1338.309510370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.013840 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.013875 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:56.513861148 +0000 UTC m=+1338.310493005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-scripts" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.055483 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell05001-account-delete-5nncz"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.063444 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.117905 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell05001-account-delete-5nncz"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.164995 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b28-account-delete-bg9sn"
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.181412 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4xcn\" (UniqueName: \"kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn\") pod \"novacell05001-account-delete-5nncz\" (UID: \"5d8359ce-901e-400a-926a-b3060c2dc789\") " pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.292489 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4xcn\" (UniqueName: \"kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn\") pod \"novacell05001-account-delete-5nncz\" (UID: \"5d8359ce-901e-400a-926a-b3060c2dc789\") " pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.292792 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.292848 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data podName:51461d28-e850-4ba3-8f27-0252b51903f1 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.292829884 +0000 UTC m=+1339.089461741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data") pod "rabbitmq-server-0" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1") : configmap "rabbitmq-config-data" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.297610 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.324106 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfzkg"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.357013 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4xcn\" (UniqueName: \"kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn\") pod \"novacell05001-account-delete-5nncz\" (UID: \"5d8359ce-901e-400a-926a-b3060c2dc789\") " pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.403400 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.433034 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.446951 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.447230 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="dnsmasq-dns" containerID="cri-o://24f083714d9fadf03161dcac11ebfaf3e1e57373e8285bfc2cfe8dda04b30139" gracePeriod=10
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.527140 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59f897c554-pws5p"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.527397 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59f897c554-pws5p" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-log" containerID="cri-o://71ede97e001d771329948c2371baea1801f48fce90e28d8bf76d142a1956d199" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.527846 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-59f897c554-pws5p" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-api" containerID="cri-o://eca91cab1b64d609b0395ae540c0b555039475f0cf2f921543fe267f50e40ea2" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.541416 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542237 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-server" containerID="cri-o://92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542369 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="swift-recon-cron" containerID="cri-o://5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542413 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="rsync" containerID="cri-o://540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542454 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-expirer" containerID="cri-o://80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542521 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-updater" containerID="cri-o://21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542569 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-auditor" containerID="cri-o://7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542613 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-replicator" containerID="cri-o://9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542655 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-server" containerID="cri-o://18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542698 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-updater" containerID="cri-o://6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542738 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-auditor" containerID="cri-o://d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542809 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-reaper" containerID="cri-o://751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542878 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-server" containerID="cri-o://c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542913 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-auditor" containerID="cri-o://be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542896 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-replicator" containerID="cri-o://e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.542879 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-replicator" containerID="cri-o://005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4" gracePeriod=30
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.591774 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-d5h6v"]
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.610562 4990 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.610855 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.610838792 +0000 UTC m=+1339.407470639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611272 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611370 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.611359956 +0000 UTC m=+1339.407991813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611461 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611570 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.611561891 +0000 UTC m=+1339.408193748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-scripts" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611649 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: E1003 10:05:56.611731 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:57.611713075 +0000 UTC m=+1339.408344932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-config" not found
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.642500 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-d5h6v"]
Oct 03 10:05:56 crc kubenswrapper[4990]: I1003 10:05:56.863301 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vgv57"]
Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:56.942829 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf is running failed: container process not found" containerID="2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" cmd=["/usr/bin/pidof","ovsdb-server"]
Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:56.943165 4990 generic.go:334] "Generic (PLEG): container finished" podID="d7c15976-1c83-43b6-8077-6af8ecc010dc"
containerID="24f083714d9fadf03161dcac11ebfaf3e1e57373e8285bfc2cfe8dda04b30139" exitCode=0 Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:56.943616 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf is running failed: container process not found" containerID="2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:56.944366 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf is running failed: container process not found" containerID="2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:56.944397 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="ovsdbserver-sb" Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:56.977115 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-68zd7_463798bd-8799-4206-bf0c-b2f62f1fc1d0/openstack-network-exporter/0.log" Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:56.977160 4990 generic.go:334] "Generic (PLEG): container finished" podID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" containerID="2f089b7e26c6c70346708ab3abe1a9903c0d6d3655a4a9350c20b8a22252b418" exitCode=2 Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:57.042484 4990 
configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 10:05:57 crc kubenswrapper[4990]: E1003 10:05:57.042567 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data podName:f6624a04-5ca4-4651-a91e-0a67f97c51b5 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.042549113 +0000 UTC m=+1340.839180970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5") : configmap "rabbitmq-cell1-config-data" not found Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.078177 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" containerID="cri-o://7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" gracePeriod=30 Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.079245 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="openstack-network-exporter" containerID="cri-o://6b3ab8b29ea9f1b7b6e8f40edcf60f82f96ba6bf30e0f0a4afbe62bd168e8f7a" gracePeriod=30 Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.188902 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" probeResult="failure" output=< Oct 03 10:05:57 crc kubenswrapper[4990]: 2025-10-03T10:05:57Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Oct 03 10:05:57 crc kubenswrapper[4990]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Oct 03 10:05:57 crc 
kubenswrapper[4990]: > Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.219755 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" probeResult="failure" output=< Oct 03 10:05:57 crc kubenswrapper[4990]: 2025-10-03T10:05:57Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Oct 03 10:05:57 crc kubenswrapper[4990]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Oct 03 10:05:57 crc kubenswrapper[4990]: > Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.332389 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294e0d0f-2fbd-42f1-90ff-af3c4188f2f2" path="/var/lib/kubelet/pods/294e0d0f-2fbd-42f1-90ff-af3c4188f2f2/volumes" Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.341013 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32563036-2ae2-4a96-8e50-94100964fd6d" path="/var/lib/kubelet/pods/32563036-2ae2-4a96-8e50-94100964fd6d/volumes" Oct 03 10:05:57 crc kubenswrapper[4990]: I1003 10:05:57.342242 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8a19e1-8b84-48a7-8dde-f22078695aa9" path="/var/lib/kubelet/pods/5c8a19e1-8b84-48a7-8dde-f22078695aa9/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.342997 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48916be-afdb-47ca-8eed-d7ad817883b3" path="/var/lib/kubelet/pods/b48916be-afdb-47ca-8eed-d7ad817883b3/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.347769 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca95bca6-8a90-4f5e-a615-ac88ab3b1be7" path="/var/lib/kubelet/pods/ca95bca6-8a90-4f5e-a615-ac88ab3b1be7/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348455 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vgv57"] Oct 03 10:05:58 crc 
kubenswrapper[4990]: I1003 10:05:57.348481 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348524 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348541 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348556 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" event={"ID":"d7c15976-1c83-43b6-8077-6af8ecc010dc","Type":"ContainerDied","Data":"24f083714d9fadf03161dcac11ebfaf3e1e57373e8285bfc2cfe8dda04b30139"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348569 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68zd7" event={"ID":"463798bd-8799-4206-bf0c-b2f62f1fc1d0","Type":"ContainerDied","Data":"2f089b7e26c6c70346708ab3abe1a9903c0d6d3655a4a9350c20b8a22252b418"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348579 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348590 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348601 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348613 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348624 4990 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell0-cell-mapping-xq2vm"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348639 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xq2vm"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348653 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwhsf"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348666 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gwhsf"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348681 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-20bd-account-create-dxfch"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348693 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-20bd-account-create-dxfch"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348705 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n8kfh"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348717 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n8kfh"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348728 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5d27-account-create-qmpzq"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348738 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5d27-account-create-qmpzq"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348748 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bb2c-account-create-cxvp9"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348757 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rxq97"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.348979 4990 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api-log" containerID="cri-o://3eac1b1500d053663746a028e055851b9a9cb1d22a20f4428e053e6a9d74b23f" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.349292 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="cinder-scheduler" containerID="cri-o://20d2d25ce9bcc245f400e40ad8c0300acef06aacb84e0cd069565068a37a1714" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.349770 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-log" containerID="cri-o://b9379e1b8b4ae7edb841ea4eecf2f3729d7382db2b0cb9c51bafb1386c50c7f8" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.349844 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="probe" containerID="cri-o://603eafaafed14cad283cbaa3059780f5a1da6d31cc82934c8ad3431436581baf" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.349924 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="openstack-network-exporter" containerID="cri-o://98c8679f7b257dccb8f8856102144602d80294d3dcc4917701e8537d48ac3f47" gracePeriod=300 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.350031 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-httpd" 
containerID="cri-o://de83fa39f0019c8400c38ca9ffa8825e384c428252e8724d9b3c68ce695b214d" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.350422 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api" containerID="cri-o://5af4d58eb46b303caa40337b751ef9459fc70d7f1edca5370cef7c7ece2d18ae" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.352062 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.352134 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data podName:51461d28-e850-4ba3-8f27-0252b51903f1 nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.352116726 +0000 UTC m=+1341.148748583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data") pod "rabbitmq-server-0" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1") : configmap "rabbitmq-config-data" not found Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.355095 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-log" containerID="cri-o://15879b52a5447c4f1318e71b94d644d6fa2c2d384c4a618bb98821e956651178" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.355234 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-httpd" containerID="cri-o://88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.356176 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.370626 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5gv6z"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.393307 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5gv6z"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.410969 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bb2c-account-create-cxvp9"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.418050 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rxq97"] Oct 03 10:05:58 crc kubenswrapper[4990]: W1003 10:05:57.424325 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72c4e608_45c9_447a_802d_dc405aac76e4.slice/crio-442f039f1f256adc71d52f276dbaff52186937021957da32696bfa28f88beb33 WatchSource:0}: Error finding container 442f039f1f256adc71d52f276dbaff52186937021957da32696bfa28f88beb33: Status 404 returned error can't find the container with id 442f039f1f256adc71d52f276dbaff52186937021957da32696bfa28f88beb33 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.437077 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-dff5-account-create-cf9kx"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.451920 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="ovsdbserver-nb" containerID="cri-o://8d47e9542f5010045e9f1e66e2c4ba83ad379324fa4cf15c59ee6b012997be8d" gracePeriod=300 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.478642 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.527786 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-dff5-account-create-cf9kx"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.538242 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tl56s"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.550603 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tl56s"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.555559 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.555880 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6778f9d745-ft6gs" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-api" 
containerID="cri-o://c1e969493c2c4af3c85759ecb6094d97806d479c080aac159e4825c9e30be4a7" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.556328 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6778f9d745-ft6gs" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-httpd" containerID="cri-o://275691f37f6f67a2de209a5a1eaf8c7ee5473a0d43e2287e05d49c37f8c3fa41" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.576337 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.580860 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.602919 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.618206 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.618762 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-log" containerID="cri-o://209b540fbfc89585fd9ea919729898d162e3563717dbbf86e50b0e3bf191be0e" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.618911 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-api" containerID="cri-o://f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.626602 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.642949 4990 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6958c"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.659256 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6958c"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.665688 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.665972 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener-log" containerID="cri-o://09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.666405 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener" containerID="cri-o://bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.667943 4990 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.667990 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.66797322 +0000 UTC m=+1341.464605077 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668076 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668107 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.668099203 +0000 UTC m=+1341.464731060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668161 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668191 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.668184295 +0000 UTC m=+1341.464816152 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-scripts" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668760 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.668830 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:05:59.668814011 +0000 UTC m=+1341.465445868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-config" not found Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.674710 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" containerID="cri-o://7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" gracePeriod=29 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.675023 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.675201 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-64b566fdb9-7b8mq" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker-log" containerID="cri-o://9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.676912 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-64b566fdb9-7b8mq" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker" containerID="cri-o://a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.696077 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="rabbitmq" containerID="cri-o://6e738f47b293c30c60c0c8652362ba0397b75b7dc42d631b6865d77252a55f68" gracePeriod=604800 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.697590 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c48a-account-create-pc4h8"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.718042 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="rabbitmq" containerID="cri-o://39ca9311a2a836a4c7c4e3966e9e9682edd65a8a88709915dec6b0db579a3d63" gracePeriod=604800 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.719826 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c48a-account-create-pc4h8"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.733607 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bm57g"] Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.738027 4990 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 10:05:58 crc kubenswrapper[4990]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 10:05:58 crc kubenswrapper[4990]: + source /usr/local/bin/container-scripts/functions Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNBridge=br-int Oct 03 
10:05:58 crc kubenswrapper[4990]: ++ OVNRemote=tcp:localhost:6642 Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNEncapType=geneve Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNAvailabilityZones= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ EnableChassisAsGateway=true Oct 03 10:05:58 crc kubenswrapper[4990]: ++ PhysicalNetworks= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNHostName= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 10:05:58 crc kubenswrapper[4990]: ++ ovs_dir=/var/lib/openvswitch Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 10:05:58 crc kubenswrapper[4990]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + cleanup_ovsdb_server_semaphore Oct 03 10:05:58 crc kubenswrapper[4990]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 10:05:58 crc kubenswrapper[4990]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-zxxk7" message=< Oct 03 10:05:58 crc kubenswrapper[4990]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 10:05:58 crc kubenswrapper[4990]: + source /usr/local/bin/container-scripts/functions Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNBridge=br-int Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNRemote=tcp:localhost:6642 Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNEncapType=geneve Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNAvailabilityZones= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ EnableChassisAsGateway=true Oct 03 10:05:58 crc kubenswrapper[4990]: ++ PhysicalNetworks= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNHostName= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 10:05:58 crc kubenswrapper[4990]: ++ ovs_dir=/var/lib/openvswitch Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 10:05:58 crc kubenswrapper[4990]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + cleanup_ovsdb_server_semaphore Oct 03 10:05:58 crc kubenswrapper[4990]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 10:05:58 crc kubenswrapper[4990]: > Oct 03 10:05:58 crc kubenswrapper[4990]: E1003 10:05:57.738075 4990 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 10:05:58 crc kubenswrapper[4990]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 10:05:58 crc kubenswrapper[4990]: + source /usr/local/bin/container-scripts/functions Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNBridge=br-int Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNRemote=tcp:localhost:6642 Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNEncapType=geneve Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNAvailabilityZones= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ EnableChassisAsGateway=true Oct 03 10:05:58 crc kubenswrapper[4990]: ++ PhysicalNetworks= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ OVNHostName= Oct 03 10:05:58 crc kubenswrapper[4990]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 10:05:58 crc kubenswrapper[4990]: ++ ovs_dir=/var/lib/openvswitch Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 10:05:58 crc kubenswrapper[4990]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 10:05:58 crc kubenswrapper[4990]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + sleep 0.5 Oct 03 10:05:58 crc kubenswrapper[4990]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 10:05:58 crc kubenswrapper[4990]: + cleanup_ovsdb_server_semaphore Oct 03 10:05:58 crc kubenswrapper[4990]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 10:05:58 crc kubenswrapper[4990]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 10:05:58 crc kubenswrapper[4990]: > pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" containerID="cri-o://554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.738127 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" containerID="cri-o://554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" gracePeriod=29 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.739905 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bm57g"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.758334 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.758688 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" containerID="cri-o://5c91cb2d9f2d0bdc8ec0998044ab7005f99f9cf45e79613caddeaf65988ab63e" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.759293 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" containerID="cri-o://e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.777044 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fc5d-account-create-kwx27"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.788179 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fc5d-account-create-kwx27"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.795959 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.796227 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ff64b77bd-5qpwf" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api-log" containerID="cri-o://dd31dfd8ea225e9fe468b78acfe34ecc14ae0b5990b5d3fa0f4371f27dad6f4c" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.797236 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7ff64b77bd-5qpwf" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api" containerID="cri-o://2f9c4a7944eae6a958448c1d084cbafac9f94669c8762fb52bb88d1d7c1f256d" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.818102 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.818391 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://5d577a29b7ee9a04a2b18df67e6f481c57ad2dbffb24a317b5c6b4aaad21f535" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.826818 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-67mbn"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.833067 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-67mbn"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.840903 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.852198 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerName="nova-cell1-conductor-conductor" containerID="cri-o://46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.852481 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e/ovsdbserver-sb/0.log" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.852614 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.859439 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.859645 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="85167add-e116-4e56-950b-fe0a6a553732" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.869319 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p67ht"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.873576 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p67ht"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976307 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976403 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976458 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc 
kubenswrapper[4990]: I1003 10:05:57.976755 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976813 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976847 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976902 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x47v\" (UniqueName: \"kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.976982 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle\") pod \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\" (UID: \"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.977039 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config" (OuterVolumeSpecName: 
"config") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.977439 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts" (OuterVolumeSpecName: "scripts") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.983175 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.985119 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:57.985133 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v" (OuterVolumeSpecName: "kube-api-access-7x47v") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "kube-api-access-7x47v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.003660 4990 generic.go:334] "Generic (PLEG): container finished" podID="b23a7883-8397-4262-a891-916de94739fd" containerID="6b3ab8b29ea9f1b7b6e8f40edcf60f82f96ba6bf30e0f0a4afbe62bd168e8f7a" exitCode=2 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.003711 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerDied","Data":"6b3ab8b29ea9f1b7b6e8f40edcf60f82f96ba6bf30e0f0a4afbe62bd168e8f7a"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.009530 4990 generic.go:334] "Generic (PLEG): container finished" podID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerID="b9379e1b8b4ae7edb841ea4eecf2f3729d7382db2b0cb9c51bafb1386c50c7f8" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.009653 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerDied","Data":"b9379e1b8b4ae7edb841ea4eecf2f3729d7382db2b0cb9c51bafb1386c50c7f8"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.017096 4990 generic.go:334] "Generic (PLEG): container finished" podID="a6db8725-5245-49dc-8bbc-9b8741622c42" containerID="61366d13516cfa8aeb4e891ab816effe7f676485e8ed9d36f65358ce050b6251" exitCode=137 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.023642 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.028920 4990 generic.go:334] "Generic (PLEG): container finished" podID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerID="275691f37f6f67a2de209a5a1eaf8c7ee5473a0d43e2287e05d49c37f8c3fa41" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.029008 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerDied","Data":"275691f37f6f67a2de209a5a1eaf8c7ee5473a0d43e2287e05d49c37f8c3fa41"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.036566 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerID="71ede97e001d771329948c2371baea1801f48fce90e28d8bf76d142a1956d199" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.036676 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerDied","Data":"71ede97e001d771329948c2371baea1801f48fce90e28d8bf76d142a1956d199"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.038868 4990 generic.go:334] "Generic (PLEG): container finished" podID="afea80e6-894d-41cd-b107-926d012e9f35" containerID="15879b52a5447c4f1318e71b94d644d6fa2c2d384c4a618bb98821e956651178" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.038934 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerDied","Data":"15879b52a5447c4f1318e71b94d644d6fa2c2d384c4a618bb98821e956651178"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.050582 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerDied","Data":"3eac1b1500d053663746a028e055851b9a9cb1d22a20f4428e053e6a9d74b23f"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.050499 4990 generic.go:334] "Generic (PLEG): container finished" podID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerID="3eac1b1500d053663746a028e055851b9a9cb1d22a20f4428e053e6a9d74b23f" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.053141 4990 generic.go:334] "Generic (PLEG): container finished" podID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerID="9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.053197 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerDied","Data":"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.055190 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement20bd-account-delete-4jvmv" event={"ID":"72c4e608-45c9-447a-802d-dc405aac76e4","Type":"ContainerStarted","Data":"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.055233 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement20bd-account-delete-4jvmv" event={"ID":"72c4e608-45c9-447a-802d-dc405aac76e4","Type":"ContainerStarted","Data":"442f039f1f256adc71d52f276dbaff52186937021957da32696bfa28f88beb33"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.055352 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement20bd-account-delete-4jvmv" podUID="72c4e608-45c9-447a-802d-dc405aac76e4" containerName="mariadb-account-delete" containerID="cri-o://7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4" gracePeriod=30 
Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.064361 4990 generic.go:334] "Generic (PLEG): container finished" podID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerID="5c91cb2d9f2d0bdc8ec0998044ab7005f99f9cf45e79613caddeaf65988ab63e" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.064411 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerDied","Data":"5c91cb2d9f2d0bdc8ec0998044ab7005f99f9cf45e79613caddeaf65988ab63e"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.067418 4990 generic.go:334] "Generic (PLEG): container finished" podID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerID="09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.067480 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerDied","Data":"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.076195 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.079837 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement20bd-account-delete-4jvmv" podStartSLOduration=3.079816554 podStartE2EDuration="3.079816554s" podCreationTimestamp="2025-10-03 10:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:05:58.076076329 +0000 UTC m=+1339.872708186" watchObservedRunningTime="2025-10-03 10:05:58.079816554 +0000 UTC m=+1339.876448411" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.081926 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.081957 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.081969 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.081978 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.081987 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x47v\" (UniqueName: \"kubernetes.io/projected/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-kube-api-access-7x47v\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc 
kubenswrapper[4990]: I1003 10:05:58.081995 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.082004 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.082740 4990 generic.go:334] "Generic (PLEG): container finished" podID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.082816 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerDied","Data":"554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.095638 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerID="dd31dfd8ea225e9fe468b78acfe34ecc14ae0b5990b5d3fa0f4371f27dad6f4c" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.095704 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerDied","Data":"dd31dfd8ea225e9fe468b78acfe34ecc14ae0b5990b5d3fa0f4371f27dad6f4c"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.099242 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9/ovsdbserver-nb/0.log" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.099270 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerID="98c8679f7b257dccb8f8856102144602d80294d3dcc4917701e8537d48ac3f47" exitCode=2 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.099282 4990 generic.go:334] "Generic (PLEG): container finished" podID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerID="8d47e9542f5010045e9f1e66e2c4ba83ad379324fa4cf15c59ee6b012997be8d" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.099321 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerDied","Data":"98c8679f7b257dccb8f8856102144602d80294d3dcc4917701e8537d48ac3f47"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.099343 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerDied","Data":"8d47e9542f5010045e9f1e66e2c4ba83ad379324fa4cf15c59ee6b012997be8d"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.105436 4990 generic.go:334] "Generic (PLEG): container finished" podID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerID="209b540fbfc89585fd9ea919729898d162e3563717dbbf86e50b0e3bf191be0e" exitCode=143 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.105583 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerDied","Data":"209b540fbfc89585fd9ea919729898d162e3563717dbbf86e50b0e3bf191be0e"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.106298 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" (UID: "f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.110348 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120264 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120292 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120299 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120306 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120312 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120318 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120335 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120388 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120404 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120416 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120428 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120439 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.120452 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 
10:05:58.123190 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123362 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123428 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123437 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123446 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123556 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123565 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555" exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123573 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301" 
exitCode=0 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.123834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126623 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126638 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126651 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126661 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126669 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.126677 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.135847 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e/ovsdbserver-sb/0.log" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.138309 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.138695 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e","Type":"ContainerDied","Data":"52be642a2c45cc7f0625fa92fcdfe338f57666d0243ebafce92d0592d6b6a529"} Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.138795 4990 scope.go:117] "RemoveContainer" containerID="78d0f733b310c1f6130bc24b6979ef796f0479532e62ab932da054c013922271" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.169392 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.169636 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerName="nova-scheduler-scheduler" containerID="cri-o://6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.190351 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.190373 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.195112 4990 scope.go:117] "RemoveContainer" containerID="2be3b6abb0f0add59f72fbde1757f2bf6a1e9d1a5f0e78f3628af871852f41bf" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.212342 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.217684 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.476604 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="galera" containerID="cri-o://daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22" gracePeriod=30 Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.702329 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.198:6080/vnc_lite.html\": dial tcp 10.217.0.198:6080: connect: connection refused" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.800003 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.816610 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-68zd7_463798bd-8799-4206-bf0c-b2f62f1fc1d0/openstack-network-exporter/0.log" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.816679 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.825978 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9/ovsdbserver-nb/0.log" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.826258 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.843832 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847198 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847332 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847358 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs\") pod 
\"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847412 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847447 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847481 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847527 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847549 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847585 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847638 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847690 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847712 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847738 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847776 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847807 4990 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz2px\" (UniqueName: \"kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847841 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfvx7\" (UniqueName: \"kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847860 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55jhr\" (UniqueName: \"kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr\") pod \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\" (UID: \"463798bd-8799-4206-bf0c-b2f62f1fc1d0\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847917 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb\") pod \"d7c15976-1c83-43b6-8077-6af8ecc010dc\" (UID: \"d7c15976-1c83-43b6-8077-6af8ecc010dc\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847954 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\" (UID: \"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9\") " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.847950 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod 
"463798bd-8799-4206-bf0c-b2f62f1fc1d0" (UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.848492 4990 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.850788 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.868521 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config" (OuterVolumeSpecName: "config") pod "463798bd-8799-4206-bf0c-b2f62f1fc1d0" (UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.869314 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.870660 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config" (OuterVolumeSpecName: "config") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.870740 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "463798bd-8799-4206-bf0c-b2f62f1fc1d0" (UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.879725 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts" (OuterVolumeSpecName: "scripts") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.887244 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr" (OuterVolumeSpecName: "kube-api-access-55jhr") pod "463798bd-8799-4206-bf0c-b2f62f1fc1d0" (UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "kube-api-access-55jhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.925822 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px" (OuterVolumeSpecName: "kube-api-access-zz2px") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "kube-api-access-zz2px". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952286 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/463798bd-8799-4206-bf0c-b2f62f1fc1d0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952324 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952333 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/463798bd-8799-4206-bf0c-b2f62f1fc1d0-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952342 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952353 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952362 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz2px\" (UniqueName: 
\"kubernetes.io/projected/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-kube-api-access-zz2px\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952372 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55jhr\" (UniqueName: \"kubernetes.io/projected/463798bd-8799-4206-bf0c-b2f62f1fc1d0-kube-api-access-55jhr\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.952396 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.992036 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e549854-6717-4898-a5ee-aca6972206a7" path="/var/lib/kubelet/pods/1e549854-6717-4898-a5ee-aca6972206a7/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.993447 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccfb8f0-6106-4c88-87eb-78e0e8481518" path="/var/lib/kubelet/pods/2ccfb8f0-6106-4c88-87eb-78e0e8481518/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.996877 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7" (OuterVolumeSpecName: "kube-api-access-lfvx7") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "kube-api-access-lfvx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.997225 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3609b86b-bac8-4a2b-a96c-9e8a317deecf" path="/var/lib/kubelet/pods/3609b86b-bac8-4a2b-a96c-9e8a317deecf/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.997927 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372689e3-d306-4bc7-86eb-b44920f77a78" path="/var/lib/kubelet/pods/372689e3-d306-4bc7-86eb-b44920f77a78/volumes" Oct 03 10:05:58 crc kubenswrapper[4990]: I1003 10:05:58.998551 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bba0b30-30c2-4170-9455-5c1e16be0844" path="/var/lib/kubelet/pods/3bba0b30-30c2-4170-9455-5c1e16be0844/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.004136 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae7b8b5-463c-4844-9b60-1e87c5681693" path="/var/lib/kubelet/pods/4ae7b8b5-463c-4844-9b60-1e87c5681693/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.005788 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544a2832-1a0d-4251-8087-8321f2f24908" path="/var/lib/kubelet/pods/544a2832-1a0d-4251-8087-8321f2f24908/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.006358 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a464354-7171-4c4d-8d90-cdd1e8d35803" path="/var/lib/kubelet/pods/7a464354-7171-4c4d-8d90-cdd1e8d35803/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.020866 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e71897-19ff-43cc-86cb-e2c5c2bce780" path="/var/lib/kubelet/pods/88e71897-19ff-43cc-86cb-e2c5c2bce780/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.021566 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6639ec-52e5-498c-a2b3-f2d615b15c60" 
path="/var/lib/kubelet/pods/9b6639ec-52e5-498c-a2b3-f2d615b15c60/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.022259 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746" path="/var/lib/kubelet/pods/a91f0bdc-c2dc-42e1-9a2e-c2b7b45c1746/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.022954 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45912f0-ed72-4751-9543-bacd041baba0" path="/var/lib/kubelet/pods/b45912f0-ed72-4751-9543-bacd041baba0/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.031392 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc84de3-47d9-42cd-9f9d-f4bd014d57ef" path="/var/lib/kubelet/pods/bbc84de3-47d9-42cd-9f9d-f4bd014d57ef/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.034470 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf25005-f050-4d64-bbeb-19faa36c9d43" path="/var/lib/kubelet/pods/bbf25005-f050-4d64-bbeb-19faa36c9d43/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.058953 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret\") pod \"a6db8725-5245-49dc-8bbc-9b8741622c42\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.068687 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle\") pod \"a6db8725-5245-49dc-8bbc-9b8741622c42\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.068769 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpdnf\" (UniqueName: 
\"kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf\") pod \"a6db8725-5245-49dc-8bbc-9b8741622c42\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.069024 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config\") pod \"a6db8725-5245-49dc-8bbc-9b8741622c42\" (UID: \"a6db8725-5245-49dc-8bbc-9b8741622c42\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.069747 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfvx7\" (UniqueName: \"kubernetes.io/projected/d7c15976-1c83-43b6-8077-6af8ecc010dc-kube-api-access-lfvx7\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.071560 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d21cb3a0-498d-4edf-b411-8a13ae88e221" path="/var/lib/kubelet/pods/d21cb3a0-498d-4edf-b411-8a13ae88e221/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.071964 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.072026 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data podName:f6624a04-5ca4-4651-a91e-0a67f97c51b5 nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.0720057 +0000 UTC m=+1344.868637557 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5") : configmap "rabbitmq-cell1-config-data" not found Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.078840 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement20bd-account-delete-4jvmv" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.098989 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" path="/var/lib/kubelet/pods/f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.100222 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d" path="/var/lib/kubelet/pods/fe39ff15-8ec8-49b9-b69d-4f17d64c6e9d/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.104497 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb8b5ec-6556-4999-a5a9-4f1e22dc4140" path="/var/lib/kubelet/pods/feb8b5ec-6556-4999-a5a9-4f1e22dc4140/volumes" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.108563 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf" (OuterVolumeSpecName: "kube-api-access-vpdnf") pod "a6db8725-5245-49dc-8bbc-9b8741622c42" (UID: "a6db8725-5245-49dc-8bbc-9b8741622c42"). InnerVolumeSpecName "kube-api-access-vpdnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.136928 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.155723 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.174660 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbhpt\" (UniqueName: \"kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt\") pod \"72c4e608-45c9-447a-802d-dc405aac76e4\" (UID: \"72c4e608-45c9-447a-802d-dc405aac76e4\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.174840 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.175330 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpdnf\" (UniqueName: \"kubernetes.io/projected/a6db8725-5245-49dc-8bbc-9b8741622c42-kube-api-access-vpdnf\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.175361 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.183867 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt" 
(OuterVolumeSpecName: "kube-api-access-zbhpt") pod "72c4e608-45c9-447a-802d-dc405aac76e4" (UID: "72c4e608-45c9-447a-802d-dc405aac76e4"). InnerVolumeSpecName "kube-api-access-zbhpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.191074 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-68zd7_463798bd-8799-4206-bf0c-b2f62f1fc1d0/openstack-network-exporter/0.log" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.191334 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-68zd7" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.192022 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.192096 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-68zd7" event={"ID":"463798bd-8799-4206-bf0c-b2f62f1fc1d0","Type":"ContainerDied","Data":"a75c9c338529a422f529a516561d3b80f75053c6e8fca4729272fbe4fc46be90"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.192164 4990 scope.go:117] "RemoveContainer" containerID="2f089b7e26c6c70346708ab3abe1a9903c0d6d3655a4a9350c20b8a22252b418" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.215306 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.215903 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5fb98f794c-zgdcx" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-httpd" containerID="cri-o://fbd93aafb30841d9e18e1ff1f7a28d4ee830685ba26f421b05af7ea452b37f7c" gracePeriod=30 Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.216397 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5fb98f794c-zgdcx" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-server" containerID="cri-o://abcf8fd5281bc2aae312ef1aab5f533fd6ed7860e53c537ab6d669fe5cb383bc" gracePeriod=30 Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.217902 4990 generic.go:334] "Generic (PLEG): container finished" podID="4064799a-3601-4426-a225-151729d11c97" containerID="603eafaafed14cad283cbaa3059780f5a1da6d31cc82934c8ad3431436581baf" exitCode=0 Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.217974 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerDied","Data":"603eafaafed14cad283cbaa3059780f5a1da6d31cc82934c8ad3431436581baf"} Oct 03 
10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.270595 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463798bd-8799-4206-bf0c-b2f62f1fc1d0" (UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.277391 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.278109 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbhpt\" (UniqueName: \"kubernetes.io/projected/72c4e608-45c9-447a-802d-dc405aac76e4-kube-api-access-zbhpt\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.278126 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.278864 4990 generic.go:334] "Generic (PLEG): container finished" podID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerID="5d577a29b7ee9a04a2b18df67e6f481c57ad2dbffb24a317b5c6b4aaad21f535" exitCode=0 Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.278953 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5b89027-cf5d-4807-adc3-b4915304f1f2","Type":"ContainerDied","Data":"5d577a29b7ee9a04a2b18df67e6f481c57ad2dbffb24a317b5c6b4aaad21f535"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.300919 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9/ovsdbserver-nb/0.log" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.301011 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9","Type":"ContainerDied","Data":"8a0ede082177925400825a20468a7f0674e16bf187f695f71d978a0215d1e517"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.301059 4990 scope.go:117] "RemoveContainer" containerID="98c8679f7b257dccb8f8856102144602d80294d3dcc4917701e8537d48ac3f47" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.301542 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.312257 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.317455 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" event={"ID":"d7c15976-1c83-43b6-8077-6af8ecc010dc","Type":"ContainerDied","Data":"db978c5278aadddd79c7a6ab5b1e6c320d71f5d5fefbe60a9435be597490149c"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.317604 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-766c7d6945-qr5ht" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.321987 4990 generic.go:334] "Generic (PLEG): container finished" podID="72c4e608-45c9-447a-802d-dc405aac76e4" containerID="7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4" exitCode=0 Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.322078 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement20bd-account-delete-4jvmv" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.322676 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement20bd-account-delete-4jvmv" event={"ID":"72c4e608-45c9-447a-802d-dc405aac76e4","Type":"ContainerDied","Data":"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.322724 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement20bd-account-delete-4jvmv" event={"ID":"72c4e608-45c9-447a-802d-dc405aac76e4","Type":"ContainerDied","Data":"442f039f1f256adc71d52f276dbaff52186937021957da32696bfa28f88beb33"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.334258 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.337013 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5d27-account-delete-qb6r6" event={"ID":"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c","Type":"ContainerStarted","Data":"cadd33a72d00696b6fb2ac6058a0851b9770eb2e7bfc0b4a8c3ab84c6b2d4c31"} Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.354602 4990 scope.go:117] "RemoveContainer" containerID="8d47e9542f5010045e9f1e66e2c4ba83ad379324fa4cf15c59ee6b012997be8d" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.379474 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.379569 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.379614 4990 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data podName:51461d28-e850-4ba3-8f27-0252b51903f1 nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.379600664 +0000 UTC m=+1345.176232521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data") pod "rabbitmq-server-0" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1") : configmap "rabbitmq-config-data" not found Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.419667 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.421210 4990 scope.go:117] "RemoveContainer" containerID="24f083714d9fadf03161dcac11ebfaf3e1e57373e8285bfc2cfe8dda04b30139" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.424884 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a6db8725-5245-49dc-8bbc-9b8741622c42" (UID: "a6db8725-5245-49dc-8bbc-9b8741622c42"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.429627 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement20bd-account-delete-4jvmv"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.483179 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.505860 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.519535 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell05001-account-delete-5nncz"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.520897 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a6db8725-5245-49dc-8bbc-9b8741622c42" (UID: "a6db8725-5245-49dc-8bbc-9b8741622c42"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.531778 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6db8725-5245-49dc-8bbc-9b8741622c42" (UID: "a6db8725-5245-49dc-8bbc-9b8741622c42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.532494 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config" (OuterVolumeSpecName: "config") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.556671 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.559572 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.573807 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.590977 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.591020 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a6db8725-5245-49dc-8bbc-9b8741622c42-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.591034 4990 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.591049 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6db8725-5245-49dc-8bbc-9b8741622c42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.591061 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.591073 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.606872 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "463798bd-8799-4206-bf0c-b2f62f1fc1d0" 
(UID: "463798bd-8799-4206-bf0c-b2f62f1fc1d0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.621902 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7c15976-1c83-43b6-8077-6af8ecc010dc" (UID: "d7c15976-1c83-43b6-8077-6af8ecc010dc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.631079 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" (UID: "eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.698557 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/463798bd-8799-4206-bf0c-b2f62f1fc1d0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.698606 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.698620 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7c15976-1c83-43b6-8077-6af8ecc010dc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.698737 4990 secret.go:188] Couldn't get secret 
openstack/neutron-config: secret "neutron-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.698798 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.698776552 +0000 UTC m=+1345.495408409 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699224 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699262 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.699254484 +0000 UTC m=+1345.495886341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699295 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699313 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. 
No retries permitted until 2025-10-03 10:06:03.699307175 +0000 UTC m=+1345.495939032 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-scripts" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699338 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.699356 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.699350806 +0000 UTC m=+1345.495982663 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-config" not found Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.781010 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.785685 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" 
containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.786497 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.786550 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.805712 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.807132 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.807844 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.812550 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:05:59 crc kubenswrapper[4990]: E1003 10:05:59.812611 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.865840 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5fb98f794c-zgdcx" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.168:8080/healthcheck\": dial tcp 10.217.0.168:8080: connect: connection refused" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.868879 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5fb98f794c-zgdcx" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.168:8080/healthcheck\": dial tcp 10.217.0.168:8080: connect: connection refused" Oct 
03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.876086 4990 scope.go:117] "RemoveContainer" containerID="80d90ac04b21b6c1e6e2c272077df839f5fc0b7edc4dff8dce45383e46f9d909" Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.883929 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-68zd7"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.894467 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-68zd7"] Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.905764 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs\") pod \"d5b89027-cf5d-4807-adc3-b4915304f1f2\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.905903 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data\") pod \"d5b89027-cf5d-4807-adc3-b4915304f1f2\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.907464 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzm55\" (UniqueName: \"kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55\") pod \"d5b89027-cf5d-4807-adc3-b4915304f1f2\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.907577 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle\") pod \"d5b89027-cf5d-4807-adc3-b4915304f1f2\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.907621 
4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs\") pod \"d5b89027-cf5d-4807-adc3-b4915304f1f2\" (UID: \"d5b89027-cf5d-4807-adc3-b4915304f1f2\") " Oct 03 10:05:59 crc kubenswrapper[4990]: I1003 10:05:59.921021 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55" (OuterVolumeSpecName: "kube-api-access-hzm55") pod "d5b89027-cf5d-4807-adc3-b4915304f1f2" (UID: "d5b89027-cf5d-4807-adc3-b4915304f1f2"). InnerVolumeSpecName "kube-api-access-hzm55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.010352 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzm55\" (UniqueName: \"kubernetes.io/projected/d5b89027-cf5d-4807-adc3-b4915304f1f2-kube-api-access-hzm55\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.076046 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "d5b89027-cf5d-4807-adc3-b4915304f1f2" (UID: "d5b89027-cf5d-4807-adc3-b4915304f1f2"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.077726 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5b89027-cf5d-4807-adc3-b4915304f1f2" (UID: "d5b89027-cf5d-4807-adc3-b4915304f1f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.079625 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data" (OuterVolumeSpecName: "config-data") pod "d5b89027-cf5d-4807-adc3-b4915304f1f2" (UID: "d5b89027-cf5d-4807-adc3-b4915304f1f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.116391 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.116415 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.116426 4990 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.130161 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "d5b89027-cf5d-4807-adc3-b4915304f1f2" (UID: "d5b89027-cf5d-4807-adc3-b4915304f1f2"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.217853 4990 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5b89027-cf5d-4807-adc3-b4915304f1f2-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.351295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5b89027-cf5d-4807-adc3-b4915304f1f2","Type":"ContainerDied","Data":"9b8f7e8d845cb735aace0c87a789a75c326074ab92ec2b02a341266267dc4d56"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.352645 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.361832 4990 generic.go:334] "Generic (PLEG): container finished" podID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerID="abcf8fd5281bc2aae312ef1aab5f533fd6ed7860e53c537ab6d669fe5cb383bc" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.361866 4990 generic.go:334] "Generic (PLEG): container finished" podID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerID="fbd93aafb30841d9e18e1ff1f7a28d4ee830685ba26f421b05af7ea452b37f7c" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.361867 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerDied","Data":"abcf8fd5281bc2aae312ef1aab5f533fd6ed7860e53c537ab6d669fe5cb383bc"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.361904 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerDied","Data":"fbd93aafb30841d9e18e1ff1f7a28d4ee830685ba26f421b05af7ea452b37f7c"} Oct 03 10:06:00 crc kubenswrapper[4990]: 
I1003 10:06:00.385210 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.390641 4990 generic.go:334] "Generic (PLEG): container finished" podID="6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" containerID="a611c2844768c9fce267e550748b6ceef23913a3939ed413aa9ad289209ac3ad" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.390728 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5d27-account-delete-qb6r6" event={"ID":"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c","Type":"ContainerDied","Data":"a611c2844768c9fce267e550748b6ceef23913a3939ed413aa9ad289209ac3ad"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.393285 4990 scope.go:117] "RemoveContainer" containerID="7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.400475 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.405963 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.406081 4990 generic.go:334] "Generic (PLEG): container finished" podID="825c3741-d390-4a7c-b3a6-50e268fbe712" containerID="25ee2439d77389480d2aaac4db1aff11cac3ac7c83e931af5125b4a6a4560cab" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.406132 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdff5-account-delete-qtjtl" event={"ID":"825c3741-d390-4a7c-b3a6-50e268fbe712","Type":"ContainerDied","Data":"25ee2439d77389480d2aaac4db1aff11cac3ac7c83e931af5125b4a6a4560cab"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.406155 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdff5-account-delete-qtjtl" event={"ID":"825c3741-d390-4a7c-b3a6-50e268fbe712","Type":"ContainerStarted","Data":"833fc39f95171c3d396de0bc83fc59cc8589bdce816253e4c847ebbc0cb3c85a"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.425014 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.431144 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-766c7d6945-qr5ht"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.450766 4990 generic.go:334] "Generic (PLEG): container finished" podID="c64d7262-ab55-4d88-bb9c-02825e07721a" containerID="0bc80b0179018a6f6162e2e984236ff0b86684b8b755febd2da28a80c9bd0466" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.450816 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b28-account-delete-bg9sn" event={"ID":"c64d7262-ab55-4d88-bb9c-02825e07721a","Type":"ContainerDied","Data":"0bc80b0179018a6f6162e2e984236ff0b86684b8b755febd2da28a80c9bd0466"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.450839 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron7b28-account-delete-bg9sn" event={"ID":"c64d7262-ab55-4d88-bb9c-02825e07721a","Type":"ContainerStarted","Data":"f65da99de0e0433908e8bb864cba35e548a42afa842556d9a02e62fc7b8a58e9"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.460471 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.460560 4990 generic.go:334] "Generic (PLEG): container finished" podID="16a22247-2803-4910-a44a-9ccba673c2cf" containerID="daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.460856 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerDied","Data":"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.460911 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16a22247-2803-4910-a44a-9ccba673c2cf","Type":"ContainerDied","Data":"6714afbbfaa0582c607dde4a23d8cd684e969ee6fbc2cfc7c5e1f572624c145d"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.472862 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerID="eca91cab1b64d609b0395ae540c0b555039475f0cf2f921543fe267f50e40ea2" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.472977 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerDied","Data":"eca91cab1b64d609b0395ae540c0b555039475f0cf2f921543fe267f50e40ea2"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.476305 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:06:00 
crc kubenswrapper[4990]: I1003 10:06:00.494330 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.520939 4990 generic.go:334] "Generic (PLEG): container finished" podID="5d8359ce-901e-400a-926a-b3060c2dc789" containerID="00f4875ec8272b3551b58fd88f4ed38c7b63402186dec2ee6b42c19ede373391" exitCode=0 Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.521002 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell05001-account-delete-5nncz" event={"ID":"5d8359ce-901e-400a-926a-b3060c2dc789","Type":"ContainerDied","Data":"00f4875ec8272b3551b58fd88f4ed38c7b63402186dec2ee6b42c19ede373391"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.521030 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell05001-account-delete-5nncz" event={"ID":"5d8359ce-901e-400a-926a-b3060c2dc789","Type":"ContainerStarted","Data":"61ea26503903cc9c2897b4189fd6a57cd0effd9de7477bdd9ecfe4893ef4075e"} Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534002 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534387 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534409 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzwpj\" (UniqueName: 
\"kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534453 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534475 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534535 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534582 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.534609 4990 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config\") pod \"16a22247-2803-4910-a44a-9ccba673c2cf\" (UID: \"16a22247-2803-4910-a44a-9ccba673c2cf\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.540570 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.540889 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.541451 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.541467 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.547957 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets" (OuterVolumeSpecName: "secrets") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.550364 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj" (OuterVolumeSpecName: "kube-api-access-gzwpj") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "kube-api-access-gzwpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.597274 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.622724 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642200 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642230 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzwpj\" (UniqueName: \"kubernetes.io/projected/16a22247-2803-4910-a44a-9ccba673c2cf-kube-api-access-gzwpj\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642240 4990 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642248 4990 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642268 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642278 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642286 4990 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.642293 4990 
reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16a22247-2803-4910-a44a-9ccba673c2cf-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.647328 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "16a22247-2803-4910-a44a-9ccba673c2cf" (UID: "16a22247-2803-4910-a44a-9ccba673c2cf"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.660295 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.720946 4990 scope.go:117] "RemoveContainer" containerID="7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.722798 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.731290 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:06:00 crc kubenswrapper[4990]: E1003 10:06:00.739368 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4\": container with ID starting with 7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4 not found: ID does not exist" containerID="7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.739414 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4"} err="failed to get container status \"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4\": rpc error: code = NotFound desc = could not find container \"7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4\": container with ID starting with 7e89082666492745a48ddbc5958054f71d2a585b0975708a8916d76de5a1e9f4 not found: ID does not exist" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.739448 4990 scope.go:117] "RemoveContainer" containerID="61366d13516cfa8aeb4e891ab816effe7f676485e8ed9d36f65358ce050b6251" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.740290 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fblmx"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.743379 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpwsf\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.743706 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.743885 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.744014 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.744159 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.755727 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.755855 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.747545 4990 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf" (OuterVolumeSpecName: "kube-api-access-jpwsf") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "kube-api-access-jpwsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.747794 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.755286 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts" (OuterVolumeSpecName: "scripts") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756047 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756186 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756340 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756406 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756497 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle\") pod \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\" (UID: \"41126aba-eb3b-4f29-89ab-29a3ea1addd9\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.756592 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.757568 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qc64\" (UniqueName: \"kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64\") pod \"5f14bb4e-f980-48fb-bba4-c068419b1975\" (UID: \"5f14bb4e-f980-48fb-bba4-c068419b1975\") " Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.768381 4990 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16a22247-2803-4910-a44a-9ccba673c2cf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.768410 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.769046 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.772804 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64" (OuterVolumeSpecName: "kube-api-access-4qc64") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "kube-api-access-4qc64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.788705 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.810464 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs" (OuterVolumeSpecName: "logs") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.822964 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fblmx"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.827524 4990 scope.go:117] "RemoveContainer" containerID="5d577a29b7ee9a04a2b18df67e6f481c57ad2dbffb24a317b5c6b4aaad21f535" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.867349 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b28-account-create-cpqx4"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906221 4990 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906248 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906258 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f14bb4e-f980-48fb-bba4-c068419b1975-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906268 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qc64\" (UniqueName: \"kubernetes.io/projected/5f14bb4e-f980-48fb-bba4-c068419b1975-kube-api-access-4qc64\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906278 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpwsf\" (UniqueName: \"kubernetes.io/projected/41126aba-eb3b-4f29-89ab-29a3ea1addd9-kube-api-access-jpwsf\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906288 4990 reconciler_common.go:293] "Volume 
detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41126aba-eb3b-4f29-89ab-29a3ea1addd9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.906296 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.924784 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" path="/var/lib/kubelet/pods/463798bd-8799-4206-bf0c-b2f62f1fc1d0/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.925617 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c4e608-45c9-447a-802d-dc405aac76e4" path="/var/lib/kubelet/pods/72c4e608-45c9-447a-802d-dc405aac76e4/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.926264 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6db8725-5245-49dc-8bbc-9b8741622c42" path="/var/lib/kubelet/pods/a6db8725-5245-49dc-8bbc-9b8741622c42/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.926811 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" path="/var/lib/kubelet/pods/d5b89027-cf5d-4807-adc3-b4915304f1f2/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.935833 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" path="/var/lib/kubelet/pods/d7c15976-1c83-43b6-8077-6af8ecc010dc/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.943728 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1b0e32-43df-49be-944a-ad51d76dbf32" path="/var/lib/kubelet/pods/dc1b0e32-43df-49be-944a-ad51d76dbf32/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.944583 4990 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" path="/var/lib/kubelet/pods/eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9/volumes" Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.979098 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"] Oct 03 10:06:00 crc kubenswrapper[4990]: I1003 10:06:00.979150 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b28-account-create-cpqx4"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.004587 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.008142 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.080367 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data" (OuterVolumeSpecName: "config-data") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.095898 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.110439 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.137714 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.167076 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data" (OuterVolumeSpecName: "config-data") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.189849 4990 scope.go:117] "RemoveContainer" containerID="daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.195211 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.195525 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-central-agent" containerID="cri-o://0cf5ac29746ce882d6ae1c7168250fbf34eb77ced233198b98d8e39f0ab37bd4" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.195784 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="proxy-httpd" containerID="cri-o://9e36ab3093e7df92e341c1eb14c639feff278ab0a3155acc7a3df5e7970a5bb6" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.195909 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="sg-core" containerID="cri-o://77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.195934 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-notification-agent" containerID="cri-o://f726157640cde355a6b4fde9ac87cd11f712f5e45c77f82242ffce8ed67bd078" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.200336 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderdff5-account-delete-qtjtl" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.214998 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.217174 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance5d27-account-delete-qb6r6" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.218865 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.219091 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="56fcd909-29b5-472d-8007-84fc511ac818" containerName="kube-state-metrics" containerID="cri-o://31042aa385dfd4cdb15ed74ad929e7230170e4b65fe953249dedab85ce6fb90d" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.231454 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.234070 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.234272 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="230b4581-35e6-4c97-9f63-73e70624bf5c" containerName="memcached" containerID="cri-o://8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.247584 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lh9qf"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.255205 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lh9qf"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.278337 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5001-account-create-zstjs"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.284951 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5001-account-create-zstjs"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.292191 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell05001-account-delete-5nncz"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.298098 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.299265 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.300404 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.300447 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerName="nova-cell1-conductor-conductor" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.302794 4990 scope.go:117] "RemoveContainer" containerID="78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.308999 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vc47v"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.309369 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.315308 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: 
"5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.315922 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vz76\" (UniqueName: \"kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76\") pod \"825c3741-d390-4a7c-b3a6-50e268fbe712\" (UID: \"825c3741-d390-4a7c-b3a6-50e268fbe712\") " Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.315997 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vktcj\" (UniqueName: \"kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj\") pod \"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c\" (UID: \"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c\") " Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.316610 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.316624 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.316883 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bnq5z"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.330905 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 10:06:01 crc 
kubenswrapper[4990]: E1003 10:06:01.338310 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.338382 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.340165 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ff64b77bd-5qpwf" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:49764->10.217.0.158:9311: read: connection reset by peer" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.340281 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7ff64b77bd-5qpwf" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:49762->10.217.0.158:9311: read: connection reset by peer" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.345282 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bnq5z"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.355033 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76" (OuterVolumeSpecName: "kube-api-access-6vz76") pod 
"825c3741-d390-4a7c-b3a6-50e268fbe712" (UID: "825c3741-d390-4a7c-b3a6-50e268fbe712"). InnerVolumeSpecName "kube-api-access-6vz76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.357639 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj" (OuterVolumeSpecName: "kube-api-access-vktcj") pod "6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" (UID: "6fad2c33-3d08-4ab8-91b2-dea27b8dc05c"). InnerVolumeSpecName "kube-api-access-vktcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.367324 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": dial tcp 10.217.0.166:8776: connect: connection refused" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.386663 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vc47v"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.386940 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41126aba-eb3b-4f29-89ab-29a3ea1addd9" (UID: "41126aba-eb3b-4f29-89ab-29a3ea1addd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.404443 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.418058 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vz76\" (UniqueName: \"kubernetes.io/projected/825c3741-d390-4a7c-b3a6-50e268fbe712-kube-api-access-6vz76\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.418093 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vktcj\" (UniqueName: \"kubernetes.io/projected/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c-kube-api-access-vktcj\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.418157 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41126aba-eb3b-4f29-89ab-29a3ea1addd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.423227 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.423559 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6897c54f48-kp6tm" podUID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" containerName="keystone-api" containerID="cri-o://f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.437579 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-88xpx"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.444852 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-88xpx"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.455865 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-1c53-account-create-j26hk"] Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.464708 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1c53-account-create-j26hk"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.475433 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf251a942_6e8b_4f2e_a6e8_b505e4921b19.slice/crio-conmon-f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf251a942_6e8b_4f2e_a6e8_b505e4921b19.slice/crio-f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f3b6bcb_ae6c_47a2_aa97_46a21b0804c5.slice/crio-conmon-77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f3b6bcb_ae6c_47a2_aa97_46a21b0804c5.slice/crio-77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafea80e6_894d_41cd_b107_926d012e9f35.slice/crio-88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbafd4ca6_6b2d_4c8e_b285_e5b29d2f4507.slice/crio-e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0.scope\": RecentStats: unable to find data in memory cache]" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.548685 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.571844 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f14bb4e-f980-48fb-bba4-c068419b1975" (UID: "5f14bb4e-f980-48fb-bba4-c068419b1975"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.575862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance5d27-account-delete-qb6r6" event={"ID":"6fad2c33-3d08-4ab8-91b2-dea27b8dc05c","Type":"ContainerDied","Data":"cadd33a72d00696b6fb2ac6058a0851b9770eb2e7bfc0b4a8c3ab84c6b2d4c31"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.575953 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance5d27-account-delete-qb6r6" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.592768 4990 generic.go:334] "Generic (PLEG): container finished" podID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerID="5af4d58eb46b303caa40337b751ef9459fc70d7f1edca5370cef7c7ece2d18ae" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.592862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerDied","Data":"5af4d58eb46b303caa40337b751ef9459fc70d7f1edca5370cef7c7ece2d18ae"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.602525 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderdff5-account-delete-qtjtl" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.603600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderdff5-account-delete-qtjtl" event={"ID":"825c3741-d390-4a7c-b3a6-50e268fbe712","Type":"ContainerDied","Data":"833fc39f95171c3d396de0bc83fc59cc8589bdce816253e4c847ebbc0cb3c85a"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.622658 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.622689 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f14bb4e-f980-48fb-bba4-c068419b1975-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.629463 4990 generic.go:334] "Generic (PLEG): container finished" podID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerID="e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.629597 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerDied","Data":"e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.702617 4990 generic.go:334] "Generic (PLEG): container finished" podID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerID="9e36ab3093e7df92e341c1eb14c639feff278ab0a3155acc7a3df5e7970a5bb6" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.702659 4990 generic.go:334] "Generic (PLEG): container finished" podID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerID="77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d" exitCode=2 
Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.702712 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerDied","Data":"9e36ab3093e7df92e341c1eb14c639feff278ab0a3155acc7a3df5e7970a5bb6"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.702741 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerDied","Data":"77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d"} Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.735787 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.746663 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.767031 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 10:06:01 crc kubenswrapper[4990]: E1003 10:06:01.767100 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerName="nova-scheduler-scheduler" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.768131 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59f897c554-pws5p" event={"ID":"5f14bb4e-f980-48fb-bba4-c068419b1975","Type":"ContainerDied","Data":"9a8c1308c67cc4e4f4b4a8993181d9f75055aafa0aa3a8119196155c514815ba"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.768228 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59f897c554-pws5p" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.818302 4990 generic.go:334] "Generic (PLEG): container finished" podID="56fcd909-29b5-472d-8007-84fc511ac818" containerID="31042aa385dfd4cdb15ed74ad929e7230170e4b65fe953249dedab85ce6fb90d" exitCode=2 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.818363 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56fcd909-29b5-472d-8007-84fc511ac818","Type":"ContainerDied","Data":"31042aa385dfd4cdb15ed74ad929e7230170e4b65fe953249dedab85ce6fb90d"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.825643 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerID="2f9c4a7944eae6a958448c1d084cbafac9f94669c8762fb52bb88d1d7c1f256d" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.825714 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerDied","Data":"2f9c4a7944eae6a958448c1d084cbafac9f94669c8762fb52bb88d1d7c1f256d"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.834351 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" 
containerName="galera" containerID="cri-o://72084d24cc256a164d380470d3a517d6d49179f56ce62666893f00f00964d3bf" gracePeriod=30 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.862407 4990 generic.go:334] "Generic (PLEG): container finished" podID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerID="de83fa39f0019c8400c38ca9ffa8825e384c428252e8724d9b3c68ce695b214d" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.862590 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerDied","Data":"de83fa39f0019c8400c38ca9ffa8825e384c428252e8724d9b3c68ce695b214d"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.875381 4990 generic.go:334] "Generic (PLEG): container finished" podID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerID="f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.875460 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerDied","Data":"f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.895580 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5fb98f794c-zgdcx" event={"ID":"41126aba-eb3b-4f29-89ab-29a3ea1addd9","Type":"ContainerDied","Data":"430e8f0f9b1648a681ddd8b3527681286bb674db23ebbfc03d262e3c6cfbdb61"} Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.896225 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5fb98f794c-zgdcx" Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.898330 4990 generic.go:334] "Generic (PLEG): container finished" podID="afea80e6-894d-41cd-b107-926d012e9f35" containerID="88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d" exitCode=0 Oct 03 10:06:01 crc kubenswrapper[4990]: I1003 10:06:01.898546 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerDied","Data":"88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d"} Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.086280 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.116463 4990 scope.go:117] "RemoveContainer" containerID="daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22" Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.138420 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22\": container with ID starting with daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22 not found: ID does not exist" containerID="daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.138462 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22"} err="failed to get container status \"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22\": rpc error: code = NotFound desc = could not find container \"daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22\": container with ID starting with 
daeb8962dd120cbdfe55d692d73fd3be8b796148a41d83b6c69b9ac106c3aa22 not found: ID does not exist" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.138484 4990 scope.go:117] "RemoveContainer" containerID="78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2" Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.146968 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2\": container with ID starting with 78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2 not found: ID does not exist" containerID="78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.147006 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2"} err="failed to get container status \"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2\": rpc error: code = NotFound desc = could not find container \"78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2\": container with ID starting with 78b8424b0085f38817a30d9f0d3bf6f34c592fcce13b95b754e86038366f6cf2 not found: ID does not exist" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.147031 4990 scope.go:117] "RemoveContainer" containerID="a611c2844768c9fce267e550748b6ceef23913a3939ed413aa9ad289209ac3ad" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.147875 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.147924 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.147965 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.148006 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.148053 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.148084 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.148127 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vzts\" (UniqueName: \"kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") " Oct 03 10:06:02 crc 
kubenswrapper[4990]: I1003 10:06:02.148269 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\" (UID: \"971e9963-b7ee-4ee8-872a-2f696bbfdb40\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.151195 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.154594 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts" (OuterVolumeSpecName: "scripts") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.155022 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs" (OuterVolumeSpecName: "logs") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.157020 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.165399 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts" (OuterVolumeSpecName: "kube-api-access-7vzts") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "kube-api-access-7vzts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.165439 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.194718 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.204840 4990 scope.go:117] "RemoveContainer" containerID="25ee2439d77389480d2aaac4db1aff11cac3ac7c83e931af5125b4a6a4560cab"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.211593 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.246215 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5fb98f794c-zgdcx"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250193 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250279 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250664 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250702 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250780 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250816 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrf9x\" (UniqueName: \"kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250881 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.250978 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.251038 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom\") pod \"7b682e49-2ca7-4692-b989-28dfbd26163e\" (UID: \"7b682e49-2ca7-4692-b989-28dfbd26163e\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252394 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252424 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252437 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-logs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252449 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971e9963-b7ee-4ee8-872a-2f696bbfdb40-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252462 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.252477 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vzts\" (UniqueName: \"kubernetes.io/projected/971e9963-b7ee-4ee8-872a-2f696bbfdb40-kube-api-access-7vzts\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.253704 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.255223 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff64b77bd-5qpwf"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.255835 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.257710 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs" (OuterVolumeSpecName: "logs") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.262621 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.262658 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderdff5-account-delete-qtjtl"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.266142 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x" (OuterVolumeSpecName: "kube-api-access-lrf9x") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "kube-api-access-lrf9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.273844 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.276826 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts" (OuterVolumeSpecName: "scripts") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.277499 4990 scope.go:117] "RemoveContainer" containerID="eca91cab1b64d609b0395ae540c0b555039475f0cf2f921543fe267f50e40ea2"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.299572 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance5d27-account-delete-qb6r6"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.306686 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data" (OuterVolumeSpecName: "config-data") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.311763 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.319732 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "971e9963-b7ee-4ee8-872a-2f696bbfdb40" (UID: "971e9963-b7ee-4ee8-872a-2f696bbfdb40"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.322639 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-59f897c554-pws5p"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.332300 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-59f897c554-pws5p"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.335113 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.336177 4990 scope.go:117] "RemoveContainer" containerID="71ede97e001d771329948c2371baea1801f48fce90e28d8bf76d142a1956d199"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.346889 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.355828 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.355876 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config\") pod \"56fcd909-29b5-472d-8007-84fc511ac818\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.355917 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.355987 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle\") pod \"56fcd909-29b5-472d-8007-84fc511ac818\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356563 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4gq4\" (UniqueName: \"kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356612 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356630 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356833 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356870 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.356893 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.358837 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjj6\" (UniqueName: \"kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.358865 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz6pz\" (UniqueName: \"kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz\") pod \"56fcd909-29b5-472d-8007-84fc511ac818\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.358883 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.358927 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs\") pod \"56fcd909-29b5-472d-8007-84fc511ac818\" (UID: \"56fcd909-29b5-472d-8007-84fc511ac818\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.358945 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359003 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data\") pod \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\" (UID: \"ebb4021b-c9ef-4b31-864d-d4874b51e47c\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359085 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle\") pod \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\" (UID: \"f251a942-6e8b-4f2e-a6e8-b505e4921b19\") "
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359709 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b682e49-2ca7-4692-b989-28dfbd26163e-logs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359722 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359733 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359744 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359753 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359762 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b682e49-2ca7-4692-b989-28dfbd26163e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359772 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971e9963-b7ee-4ee8-872a-2f696bbfdb40-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.359982 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrf9x\" (UniqueName: \"kubernetes.io/projected/7b682e49-2ca7-4692-b989-28dfbd26163e-kube-api-access-lrf9x\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.361853 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs" (OuterVolumeSpecName: "logs") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.364956 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.366104 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs" (OuterVolumeSpecName: "logs") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.367945 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4" (OuterVolumeSpecName: "kube-api-access-j4gq4") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "kube-api-access-j4gq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.374906 4990 scope.go:117] "RemoveContainer" containerID="abcf8fd5281bc2aae312ef1aab5f533fd6ed7860e53c537ab6d669fe5cb383bc"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.391016 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.391184 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6" (OuterVolumeSpecName: "kube-api-access-lxjj6") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "kube-api-access-lxjj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.406756 4990 scope.go:117] "RemoveContainer" containerID="fbd93aafb30841d9e18e1ff1f7a28d4ee830685ba26f421b05af7ea452b37f7c"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.406818 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz" (OuterVolumeSpecName: "kube-api-access-nz6pz") pod "56fcd909-29b5-472d-8007-84fc511ac818" (UID: "56fcd909-29b5-472d-8007-84fc511ac818"). InnerVolumeSpecName "kube-api-access-nz6pz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.451555 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56fcd909-29b5-472d-8007-84fc511ac818" (UID: "56fcd909-29b5-472d-8007-84fc511ac818"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472018 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "56fcd909-29b5-472d-8007-84fc511ac818" (UID: "56fcd909-29b5-472d-8007-84fc511ac818"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472707 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjj6\" (UniqueName: \"kubernetes.io/projected/ebb4021b-c9ef-4b31-864d-d4874b51e47c-kube-api-access-lxjj6\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472725 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz6pz\" (UniqueName: \"kubernetes.io/projected/56fcd909-29b5-472d-8007-84fc511ac818-kube-api-access-nz6pz\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472739 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251a942-6e8b-4f2e-a6e8-b505e4921b19-logs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472753 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472766 4990 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472783 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472796 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4gq4\" (UniqueName: \"kubernetes.io/projected/f251a942-6e8b-4f2e-a6e8-b505e4921b19-kube-api-access-j4gq4\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472809 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebb4021b-c9ef-4b31-864d-d4874b51e47c-logs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.472821 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.513659 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.573771 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data" (OuterVolumeSpecName: "config-data") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.574685 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.574708 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.588706 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.601052 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.602593 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.604647 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 03 10:06:02 crc kubenswrapper[4990]: E1003 10:06:02.604726 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="85167add-e116-4e56-950b-fe0a6a553732" containerName="nova-cell0-conductor-conductor"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.649111 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data" (OuterVolumeSpecName: "config-data") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.649110 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.678489 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.678588 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.678598 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.680100 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7b682e49-2ca7-4692-b989-28dfbd26163e" (UID: "7b682e49-2ca7-4692-b989-28dfbd26163e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.698568 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data" (OuterVolumeSpecName: "config-data") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.702764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.712192 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "56fcd909-29b5-472d-8007-84fc511ac818" (UID: "56fcd909-29b5-472d-8007-84fc511ac818"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.712994 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f251a942-6e8b-4f2e-a6e8-b505e4921b19" (UID: "f251a942-6e8b-4f2e-a6e8-b505e4921b19"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.716406 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.730452 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebb4021b-c9ef-4b31-864d-d4874b51e47c" (UID: "ebb4021b-c9ef-4b31-864d-d4874b51e47c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788347 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b682e49-2ca7-4692-b989-28dfbd26163e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788752 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788763 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788772 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788781 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251a942-6e8b-4f2e-a6e8-b505e4921b19-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788791 4990 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/56fcd909-29b5-472d-8007-84fc511ac818-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.788802 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebb4021b-c9ef-4b31-864d-d4874b51e47c-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.832201 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.852447 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell05001-account-delete-5nncz"
Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.877860 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron7b28-account-delete-bg9sn" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.886635 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007a3204-42d2-4769-b267-c80963e1810e" path="/var/lib/kubelet/pods/007a3204-42d2-4769-b267-c80963e1810e/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.889859 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvm5b\" (UniqueName: \"kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b\") pod \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.890122 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs\") pod \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.890215 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs\") pod \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.891940 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" path="/var/lib/kubelet/pods/16a22247-2803-4910-a44a-9ccba673c2cf/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.888367 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.892592 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs" (OuterVolumeSpecName: "logs") pod "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" (UID: "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.893874 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data\") pod \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.896832 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle\") pod \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\" (UID: \"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507\") " Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.897748 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b" (OuterVolumeSpecName: "kube-api-access-xvm5b") pod "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" (UID: "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507"). InnerVolumeSpecName "kube-api-access-xvm5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.898313 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b57ed7-8a33-445c-92fd-f95e0386fb1b" path="/var/lib/kubelet/pods/21b57ed7-8a33-445c-92fd-f95e0386fb1b/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.899337 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.899362 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvm5b\" (UniqueName: \"kubernetes.io/projected/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-kube-api-access-xvm5b\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.899972 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" path="/var/lib/kubelet/pods/41126aba-eb3b-4f29-89ab-29a3ea1addd9/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.907245 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b4be29-bab9-4e12-ada2-152f89f790bc" path="/var/lib/kubelet/pods/50b4be29-bab9-4e12-ada2-152f89f790bc/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.908228 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" path="/var/lib/kubelet/pods/5f14bb4e-f980-48fb-bba4-c068419b1975/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.908883 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" path="/var/lib/kubelet/pods/6fad2c33-3d08-4ab8-91b2-dea27b8dc05c/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.909458 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="825c3741-d390-4a7c-b3a6-50e268fbe712" path="/var/lib/kubelet/pods/825c3741-d390-4a7c-b3a6-50e268fbe712/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.910588 4990 generic.go:334] "Generic (PLEG): container finished" podID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerID="6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8" exitCode=0 Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.910603 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ca1241-a022-4b2a-988f-79e2fd42e6aa" path="/var/lib/kubelet/pods/87ca1241-a022-4b2a-988f-79e2fd42e6aa/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.911343 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eda8f25-c4cb-4c1d-86cf-d09703e3c953" path="/var/lib/kubelet/pods/9eda8f25-c4cb-4c1d-86cf-d09703e3c953/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.911842 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3921c9c-8444-4c5f-814b-f6a26a9cf5df" path="/var/lib/kubelet/pods/e3921c9c-8444-4c5f-814b-f6a26a9cf5df/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.912455 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edcff799-1dbf-4115-9f05-3e5164b331ad" path="/var/lib/kubelet/pods/edcff799-1dbf-4115-9f05-3e5164b331ad/volumes" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.912914 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.936697 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" (UID: "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.939867 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.942582 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.947230 4990 generic.go:334] "Generic (PLEG): container finished" podID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerID="46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4" exitCode=0 Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.957308 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.961317 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data" (OuterVolumeSpecName: "config-data") pod "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" (UID: "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.961941 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.968523 4990 generic.go:334] "Generic (PLEG): container finished" podID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerID="0cf5ac29746ce882d6ae1c7168250fbf34eb77ced233198b98d8e39f0ab37bd4" exitCode=0 Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.969393 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" (UID: "bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.973119 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell05001-account-delete-5nncz" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.976032 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.977997 4990 generic.go:334] "Generic (PLEG): container finished" podID="230b4581-35e6-4c97-9f63-73e70624bf5c" containerID="8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf" exitCode=0 Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.978092 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 10:06:02 crc kubenswrapper[4990]: I1003 10:06:02.981151 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001413 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90","Type":"ContainerDied","Data":"6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001462 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"56fcd909-29b5-472d-8007-84fc511ac818","Type":"ContainerDied","Data":"c77b889243516ebf608418e7d11a97d534c9b1388e38a9b0f3b4e0845ab77456"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001484 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7b682e49-2ca7-4692-b989-28dfbd26163e","Type":"ContainerDied","Data":"47b3ee5fe09c1288dcc55759d5124e0dfde3e2307c5ecb7f0af3741bbfc250d9"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001500 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971e9963-b7ee-4ee8-872a-2f696bbfdb40","Type":"ContainerDied","Data":"2320cf1b112fb76ae26987d386aaedb7324559fc85344204f40758a4889f9ef5"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001534 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ac157dc7-6df6-4f4f-ba65-c85b58f78fff","Type":"ContainerDied","Data":"46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001552 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f251a942-6e8b-4f2e-a6e8-b505e4921b19","Type":"ContainerDied","Data":"e041a5113858480ecf9548c85d70cb54ae3feab2b955a40eda4bb8b2fae3153f"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001569 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerDied","Data":"0cf5ac29746ce882d6ae1c7168250fbf34eb77ced233198b98d8e39f0ab37bd4"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001588 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell05001-account-delete-5nncz" event={"ID":"5d8359ce-901e-400a-926a-b3060c2dc789","Type":"ContainerDied","Data":"61ea26503903cc9c2897b4189fd6a57cd0effd9de7477bdd9ecfe4893ef4075e"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001600 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61ea26503903cc9c2897b4189fd6a57cd0effd9de7477bdd9ecfe4893ef4075e" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001621 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afea80e6-894d-41cd-b107-926d012e9f35","Type":"ContainerDied","Data":"580bc6f927ad111e74705a2be9063fdd1335fba97dc16dee485218c69c8e7593"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001635 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"230b4581-35e6-4c97-9f63-73e70624bf5c","Type":"ContainerDied","Data":"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001649 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"230b4581-35e6-4c97-9f63-73e70624bf5c","Type":"ContainerDied","Data":"795218cff2f7ef7fe86b2383573d93567b8f0f30d0236156bd11d2b7f2240923"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001660 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507","Type":"ContainerDied","Data":"5a13fac76cfd5dcbf371253b3e501d4b6fe86385968822d4cfdeee3500f015a1"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.001684 4990 scope.go:117] 
"RemoveContainer" containerID="31042aa385dfd4cdb15ed74ad929e7230170e4b65fe953249dedab85ce6fb90d" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.013386 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4xcn\" (UniqueName: \"kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn\") pod \"5d8359ce-901e-400a-926a-b3060c2dc789\" (UID: \"5d8359ce-901e-400a-926a-b3060c2dc789\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017634 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn" (OuterVolumeSpecName: "kube-api-access-m4xcn") pod "5d8359ce-901e-400a-926a-b3060c2dc789" (UID: "5d8359ce-901e-400a-926a-b3060c2dc789"). InnerVolumeSpecName "kube-api-access-m4xcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017723 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data\") pod \"230b4581-35e6-4c97-9f63-73e70624bf5c\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017779 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle\") pod \"230b4581-35e6-4c97-9f63-73e70624bf5c\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017802 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerDied","Data":"20d2d25ce9bcc245f400e40ad8c0300acef06aacb84e0cd069565068a37a1714"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017973 
4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrrk\" (UniqueName: \"kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk\") pod \"230b4581-35e6-4c97-9f63-73e70624bf5c\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018006 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config\") pod \"230b4581-35e6-4c97-9f63-73e70624bf5c\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.017771 4990 generic.go:334] "Generic (PLEG): container finished" podID="4064799a-3601-4426-a225-151729d11c97" containerID="20d2d25ce9bcc245f400e40ad8c0300acef06aacb84e0cd069565068a37a1714" exitCode=0 Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018098 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs\") pod \"230b4581-35e6-4c97-9f63-73e70624bf5c\" (UID: \"230b4581-35e6-4c97-9f63-73e70624bf5c\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018140 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvlsr\" (UniqueName: \"kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr\") pod \"c64d7262-ab55-4d88-bb9c-02825e07721a\" (UID: \"c64d7262-ab55-4d88-bb9c-02825e07721a\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018098 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4064799a-3601-4426-a225-151729d11c97","Type":"ContainerDied","Data":"75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 
10:06:03.018473 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75b7dd6289c25347a32f40d5346be61b68d580ba8e6cf62df4948b9e821f6bbb" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018558 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data" (OuterVolumeSpecName: "config-data") pod "230b4581-35e6-4c97-9f63-73e70624bf5c" (UID: "230b4581-35e6-4c97-9f63-73e70624bf5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018928 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018954 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018969 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018982 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4xcn\" (UniqueName: \"kubernetes.io/projected/5d8359ce-901e-400a-926a-b3060c2dc789-kube-api-access-m4xcn\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.018995 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc 
kubenswrapper[4990]: I1003 10:06:03.019593 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "230b4581-35e6-4c97-9f63-73e70624bf5c" (UID: "230b4581-35e6-4c97-9f63-73e70624bf5c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.023319 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk" (OuterVolumeSpecName: "kube-api-access-hcrrk") pod "230b4581-35e6-4c97-9f63-73e70624bf5c" (UID: "230b4581-35e6-4c97-9f63-73e70624bf5c"). InnerVolumeSpecName "kube-api-access-hcrrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.023361 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr" (OuterVolumeSpecName: "kube-api-access-nvlsr") pod "c64d7262-ab55-4d88-bb9c-02825e07721a" (UID: "c64d7262-ab55-4d88-bb9c-02825e07721a"). InnerVolumeSpecName "kube-api-access-nvlsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.027056 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.034799 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron7b28-account-delete-bg9sn" event={"ID":"c64d7262-ab55-4d88-bb9c-02825e07721a","Type":"ContainerDied","Data":"f65da99de0e0433908e8bb864cba35e548a42afa842556d9a02e62fc7b8a58e9"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.034834 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65da99de0e0433908e8bb864cba35e548a42afa842556d9a02e62fc7b8a58e9" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.034882 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron7b28-account-delete-bg9sn" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.049275 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff64b77bd-5qpwf" event={"ID":"ebb4021b-c9ef-4b31-864d-d4874b51e47c","Type":"ContainerDied","Data":"1b3039dbb00e8d2505870cd5c3f0794059ebcdb82f117d2e9019b0d60938ce5a"} Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.049370 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7ff64b77bd-5qpwf" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.050664 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "230b4581-35e6-4c97-9f63-73e70624bf5c" (UID: "230b4581-35e6-4c97-9f63-73e70624bf5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.080682 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "230b4581-35e6-4c97-9f63-73e70624bf5c" (UID: "230b4581-35e6-4c97-9f63-73e70624bf5c"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.115024 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127391 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127641 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127773 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127775 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwzwb\" (UniqueName: \"kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127859 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127915 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127943 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.127991 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128043 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128073 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " 
Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128120 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128158 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128187 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhcgf\" (UniqueName: \"kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128216 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle\") pod \"4064799a-3601-4426-a225-151729d11c97\" (UID: \"4064799a-3601-4426-a225-151729d11c97\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.128240 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"afea80e6-894d-41cd-b107-926d012e9f35\" (UID: \"afea80e6-894d-41cd-b107-926d012e9f35\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.129631 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs" (OuterVolumeSpecName: "logs") pod 
"afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.131623 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts" (OuterVolumeSpecName: "scripts") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.132268 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.133376 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.134363 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.135410 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf" (OuterVolumeSpecName: "kube-api-access-dhcgf") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "kube-api-access-dhcgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.136697 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.142419 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.144621 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts" (OuterVolumeSpecName: "scripts") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.149454 4990 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.149553 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvlsr\" (UniqueName: \"kubernetes.io/projected/c64d7262-ab55-4d88-bb9c-02825e07721a-kube-api-access-nvlsr\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.149572 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230b4581-35e6-4c97-9f63-73e70624bf5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.149584 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrrk\" (UniqueName: \"kubernetes.io/projected/230b4581-35e6-4c97-9f63-73e70624bf5c-kube-api-access-hcrrk\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.149623 4990 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/230b4581-35e6-4c97-9f63-73e70624bf5c-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.151506 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.151607 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data podName:f6624a04-5ca4-4651-a91e-0a67f97c51b5 nodeName:}" failed. 
No retries permitted until 2025-10-03 10:06:11.151584781 +0000 UTC m=+1352.948216708 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data") pod "rabbitmq-cell1-server-0" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5") : configmap "rabbitmq-cell1-config-data" not found Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.169740 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb" (OuterVolumeSpecName: "kube-api-access-dwzwb") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "kube-api-access-dwzwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.169858 4990 scope.go:117] "RemoveContainer" containerID="5af4d58eb46b303caa40337b751ef9459fc70d7f1edca5370cef7c7ece2d18ae" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.183492 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.199545 4990 scope.go:117] "RemoveContainer" containerID="3eac1b1500d053663746a028e055851b9a9cb1d22a20f4428e053e6a9d74b23f" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.212185 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.217006 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.226816 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data" (OuterVolumeSpecName: "config-data") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251221 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data\") pod \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251458 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrdtr\" (UniqueName: \"kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr\") pod \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251628 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") pod \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251676 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle\") pod \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251728 4990 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data\") pod \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.251763 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd97w\" (UniqueName: \"kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w\") pod \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\" (UID: \"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252804 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4064799a-3601-4426-a225-151729d11c97-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252826 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252837 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252853 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252865 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc 
kubenswrapper[4990]: I1003 10:06:03.252876 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhcgf\" (UniqueName: \"kubernetes.io/projected/4064799a-3601-4426-a225-151729d11c97-kube-api-access-dhcgf\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252907 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252919 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252931 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afea80e6-894d-41cd-b107-926d012e9f35-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252949 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwzwb\" (UniqueName: \"kubernetes.io/projected/afea80e6-894d-41cd-b107-926d012e9f35-kube-api-access-dwzwb\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.252959 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.286890 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle podName:ac157dc7-6df6-4f4f-ba65-c85b58f78fff nodeName:}" failed. No retries permitted until 2025-10-03 10:06:03.786842382 +0000 UTC m=+1345.583474239 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle") pod "ac157dc7-6df6-4f4f-ba65-c85b58f78fff" (UID: "ac157dc7-6df6-4f4f-ba65-c85b58f78fff") : error deleting /var/lib/kubelet/pods/ac157dc7-6df6-4f4f-ba65-c85b58f78fff/volume-subpaths: remove /var/lib/kubelet/pods/ac157dc7-6df6-4f4f-ba65-c85b58f78fff/volume-subpaths: no such file or directory Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.287886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w" (OuterVolumeSpecName: "kube-api-access-qd97w") pod "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" (UID: "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90"). InnerVolumeSpecName "kube-api-access-qd97w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.289815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.291686 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.292487 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr" (OuterVolumeSpecName: "kube-api-access-hrdtr") pod "ac157dc7-6df6-4f4f-ba65-c85b58f78fff" (UID: "ac157dc7-6df6-4f4f-ba65-c85b58f78fff"). InnerVolumeSpecName "kube-api-access-hrdtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.293997 4990 scope.go:117] "RemoveContainer" containerID="de83fa39f0019c8400c38ca9ffa8825e384c428252e8724d9b3c68ce695b214d" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.306645 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.312170 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data" (OuterVolumeSpecName: "config-data") pod "ac157dc7-6df6-4f4f-ba65-c85b58f78fff" (UID: "ac157dc7-6df6-4f4f-ba65-c85b58f78fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.343715 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data" (OuterVolumeSpecName: "config-data") pod "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" (UID: "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.344228 4990 scope.go:117] "RemoveContainer" containerID="b9379e1b8b4ae7edb841ea4eecf2f3729d7382db2b0cb9c51bafb1386c50c7f8" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.352873 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" (UID: "cbe27b2c-9f5a-4687-bff9-8a36d03f8a90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358683 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358722 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrdtr\" (UniqueName: \"kubernetes.io/projected/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-kube-api-access-hrdtr\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358762 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358775 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358788 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358799 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.358810 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd97w\" (UniqueName: \"kubernetes.io/projected/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90-kube-api-access-qd97w\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.368737 4990 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "afea80e6-894d-41cd-b107-926d012e9f35" (UID: "afea80e6-894d-41cd-b107-926d012e9f35"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.370207 4990 scope.go:117] "RemoveContainer" containerID="f7d3bf56d474b40153f72d5bc27483a5ac6dcc6d775e56fb75b8e6acd314a813" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.399773 4990 scope.go:117] "RemoveContainer" containerID="209b540fbfc89585fd9ea919729898d162e3563717dbbf86e50b0e3bf191be0e" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.404758 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data" (OuterVolumeSpecName: "config-data") pod "4064799a-3601-4426-a225-151729d11c97" (UID: "4064799a-3601-4426-a225-151729d11c97"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.410563 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.419742 4990 scope.go:117] "RemoveContainer" containerID="88d641d1bf486138095cda080880426b764b3efa9d3577891d4ca40c95c3393d" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.420980 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.427568 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.432687 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7ff64b77bd-5qpwf"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.442231 4990 scope.go:117] "RemoveContainer" containerID="15879b52a5447c4f1318e71b94d644d6fa2c2d384c4a618bb98821e956651178" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.442583 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.448919 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.453109 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.457374 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.460138 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4064799a-3601-4426-a225-151729d11c97-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.460355 4990 reconciler_common.go:293] "Volume 
detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afea80e6-894d-41cd-b107-926d012e9f35-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.460334 4990 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.460551 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data podName:51461d28-e850-4ba3-8f27-0252b51903f1 nodeName:}" failed. No retries permitted until 2025-10-03 10:06:11.460532219 +0000 UTC m=+1353.257164076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data") pod "rabbitmq-server-0" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1") : configmap "rabbitmq-config-data" not found Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.462214 4990 scope.go:117] "RemoveContainer" containerID="8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.469928 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.473122 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron7b28-account-delete-bg9sn"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.476200 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell05001-account-delete-5nncz"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.480337 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell05001-account-delete-5nncz"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.484906 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 
10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.493173 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.578685 4990 scope.go:117] "RemoveContainer" containerID="8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf" Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.584267 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf\": container with ID starting with 8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf not found: ID does not exist" containerID="8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.584314 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf"} err="failed to get container status \"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf\": rpc error: code = NotFound desc = could not find container \"8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf\": container with ID starting with 8ba37773f4d5101c6ab9e34c9514ce898c395fdde5008f4d8091f23f79118faf not found: ID does not exist" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.584371 4990 scope.go:117] "RemoveContainer" containerID="e50c76114e970456369b10767db9ea929de83f5f9f49ec0e5b445f383d8fe4e0" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.623567 4990 scope.go:117] "RemoveContainer" containerID="5c91cb2d9f2d0bdc8ec0998044ab7005f99f9cf45e79613caddeaf65988ab63e" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.628158 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.631574 4990 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.648061 4990 scope.go:117] "RemoveContainer" containerID="2f9c4a7944eae6a958448c1d084cbafac9f94669c8762fb52bb88d1d7c1f256d" Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.675476 4990 scope.go:117] "RemoveContainer" containerID="dd31dfd8ea225e9fe468b78acfe34ecc14ae0b5990b5d3fa0f4371f27dad6f4c" Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.769844 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.769896 4990 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.769926 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:06:11.769908478 +0000 UTC m=+1353.566540335 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.769861 4990 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.769964 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts podName:b23a7883-8397-4262-a891-916de94739fd nodeName:}" failed. No retries permitted until 2025-10-03 10:06:11.769947829 +0000 UTC m=+1353.566579686 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts") pod "ovn-northd-0" (UID: "b23a7883-8397-4262-a891-916de94739fd") : configmap "ovnnorthd-scripts" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.770008 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:11.76999496 +0000 UTC m=+1353.566626817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.770025 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: E1003 10:06:03.770164 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:11.770132734 +0000 UTC m=+1353.566764591 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.870714 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") pod \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\" (UID: \"ac157dc7-6df6-4f4f-ba65-c85b58f78fff\") " Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.875224 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac157dc7-6df6-4f4f-ba65-c85b58f78fff" (UID: "ac157dc7-6df6-4f4f-ba65-c85b58f78fff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:03 crc kubenswrapper[4990]: I1003 10:06:03.985950 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac157dc7-6df6-4f4f-ba65-c85b58f78fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.071797 4990 generic.go:334] "Generic (PLEG): container finished" podID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerID="72084d24cc256a164d380470d3a517d6d49179f56ce62666893f00f00964d3bf" exitCode=0
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.071857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerDied","Data":"72084d24cc256a164d380470d3a517d6d49179f56ce62666893f00f00964d3bf"}
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.096978 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ac157dc7-6df6-4f4f-ba65-c85b58f78fff","Type":"ContainerDied","Data":"a039d06cf9ab0385593ea4c8b98b9203f4e79c71af9d9a55959c54b61e5366cd"}
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.097021 4990 scope.go:117] "RemoveContainer" containerID="46e13f097dbadede2c3cc71a0e4ec9fe6d9a4c7164ffcc20da40533122fd3bc4"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.097128 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.109132 4990 generic.go:334] "Generic (PLEG): container finished" podID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerID="39ca9311a2a836a4c7c4e3966e9e9682edd65a8a88709915dec6b0db579a3d63" exitCode=0
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.109233 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerDied","Data":"39ca9311a2a836a4c7c4e3966e9e9682edd65a8a88709915dec6b0db579a3d63"}
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.111074 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbe27b2c-9f5a-4687-bff9-8a36d03f8a90","Type":"ContainerDied","Data":"066fbe0f8a16662f328dda1a940e255130607cbaf18ba4213d35893e4bfd8ccf"}
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.111205 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.119827 4990 generic.go:334] "Generic (PLEG): container finished" podID="51461d28-e850-4ba3-8f27-0252b51903f1" containerID="6e738f47b293c30c60c0c8652362ba0397b75b7dc42d631b6865d77252a55f68" exitCode=0
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.119924 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerDied","Data":"6e738f47b293c30c60c0c8652362ba0397b75b7dc42d631b6865d77252a55f68"}
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.119951 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.187264 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.204333 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.226821 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.237580 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.241741 4990 scope.go:117] "RemoveContainer" containerID="6adc7ab75463d0f987dece3ce8e23ed148319e34a206c17402e0f133ac4e50a8"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.244163 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.250172 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.387016 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507133 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507196 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507246 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5wdc\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507354 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507405 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507447 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507469 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507544 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507580 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507622 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.507652 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins\") pod \"51461d28-e850-4ba3-8f27-0252b51903f1\" (UID: \"51461d28-e850-4ba3-8f27-0252b51903f1\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.508553 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.513047 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info" (OuterVolumeSpecName: "pod-info") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.513541 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.513828 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.517651 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.521033 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.521242 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.524865 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc" (OuterVolumeSpecName: "kube-api-access-c5wdc") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "kube-api-access-c5wdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.560188 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data" (OuterVolumeSpecName: "config-data") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.601840 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf" (OuterVolumeSpecName: "server-conf") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612730 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612761 4990 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51461d28-e850-4ba3-8f27-0252b51903f1-pod-info\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612771 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612798 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612807 4990 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51461d28-e850-4ba3-8f27-0252b51903f1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612816 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612824 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612832 4990 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612840 4990 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51461d28-e850-4ba3-8f27-0252b51903f1-server-conf\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.612848 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5wdc\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-kube-api-access-c5wdc\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.632205 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.634065 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "51461d28-e850-4ba3-8f27-0252b51903f1" (UID: "51461d28-e850-4ba3-8f27-0252b51903f1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.715077 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.715503 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51461d28-e850-4ba3-8f27-0252b51903f1-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.763384 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.763966 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.764210 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.764242 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server"
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.765259 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.766369 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.767557 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Oct 03 10:06:04 crc kubenswrapper[4990]: E1003 10:06:04.767580 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.781481 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output="command timed out"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.792640 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.803358 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.846780 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" probeResult="failure" output=<
Oct 03 10:06:04 crc kubenswrapper[4990]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0
Oct 03 10:06:04 crc kubenswrapper[4990]: >
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.892847 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230b4581-35e6-4c97-9f63-73e70624bf5c" path="/var/lib/kubelet/pods/230b4581-35e6-4c97-9f63-73e70624bf5c/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.894119 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4064799a-3601-4426-a225-151729d11c97" path="/var/lib/kubelet/pods/4064799a-3601-4426-a225-151729d11c97/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.895321 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56fcd909-29b5-472d-8007-84fc511ac818" path="/var/lib/kubelet/pods/56fcd909-29b5-472d-8007-84fc511ac818/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.898957 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8359ce-901e-400a-926a-b3060c2dc789" path="/var/lib/kubelet/pods/5d8359ce-901e-400a-926a-b3060c2dc789/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.900688 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" path="/var/lib/kubelet/pods/7b682e49-2ca7-4692-b989-28dfbd26163e/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.901696 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" path="/var/lib/kubelet/pods/971e9963-b7ee-4ee8-872a-2f696bbfdb40/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.905162 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" path="/var/lib/kubelet/pods/ac157dc7-6df6-4f4f-ba65-c85b58f78fff/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.905847 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afea80e6-894d-41cd-b107-926d012e9f35" path="/var/lib/kubelet/pods/afea80e6-894d-41cd-b107-926d012e9f35/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.906803 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" path="/var/lib/kubelet/pods/bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.909461 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64d7262-ab55-4d88-bb9c-02825e07721a" path="/var/lib/kubelet/pods/c64d7262-ab55-4d88-bb9c-02825e07721a/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.910147 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" path="/var/lib/kubelet/pods/cbe27b2c-9f5a-4687-bff9-8a36d03f8a90/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.911834 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" path="/var/lib/kubelet/pods/ebb4021b-c9ef-4b31-864d-d4874b51e47c/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.914356 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" path="/var/lib/kubelet/pods/f251a942-6e8b-4f2e-a6e8-b505e4921b19/volumes"
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919201 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919244 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919261 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919288 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919364 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919408 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919448 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919471 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919502 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919535 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919553 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdd4\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919585 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919605 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919622 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919756 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919779 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919846 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gskbm\" (UniqueName: \"kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919890 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.919961 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd\") pod \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\" (UID: \"f6624a04-5ca4-4651-a91e-0a67f97c51b5\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.920011 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\" (UID: \"8fe31a60-7e5f-40a8-acf3-d7a17c210e74\") "
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.920109 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.922107 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.925470 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.925523 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-generated\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.925536 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.926479 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.927915 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.930426 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.931370 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.931547 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.933136 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info" (OuterVolumeSpecName: "pod-info") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.933221 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm" (OuterVolumeSpecName: "kube-api-access-gskbm") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "kube-api-access-gskbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.935280 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.935582 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.935690 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets" (OuterVolumeSpecName: "secrets") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.937369 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4" (OuterVolumeSpecName: "kube-api-access-4mdd4") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "kube-api-access-4mdd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.985647 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.986787 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data" (OuterVolumeSpecName: "config-data") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 10:06:04 crc kubenswrapper[4990]: I1003 10:06:04.990243 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "local-storage02-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029488 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029533 4990 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029542 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdd4\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-kube-api-access-4mdd4\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029552 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029560 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029568 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029576 4990 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: 
I1003 10:06:05.029584 4990 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029592 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gskbm\" (UniqueName: \"kubernetes.io/projected/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-kube-api-access-gskbm\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029610 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029619 4990 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f6624a04-5ca4-4651-a91e-0a67f97c51b5-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029631 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029639 4990 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.029648 4990 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f6624a04-5ca4-4651-a91e-0a67f97c51b5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.046461 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.048220 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.083724 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.098566 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8fe31a60-7e5f-40a8-acf3-d7a17c210e74" (UID: "8fe31a60-7e5f-40a8-acf3-d7a17c210e74"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.131226 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.131252 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.131263 4990 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fe31a60-7e5f-40a8-acf3-d7a17c210e74-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.131877 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf" (OuterVolumeSpecName: "server-conf") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: 
"f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.132209 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b23a7883-8397-4262-a891-916de94739fd/ovn-northd/0.log" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.132266 4990 generic.go:334] "Generic (PLEG): container finished" podID="b23a7883-8397-4262-a891-916de94739fd" containerID="7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" exitCode=139 Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.132413 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerDied","Data":"7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.142961 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fe31a60-7e5f-40a8-acf3-d7a17c210e74","Type":"ContainerDied","Data":"21b2eb22da5beaa5673b813cddc311554ce933b5161e40ba8798952257c54e8d"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.143023 4990 scope.go:117] "RemoveContainer" containerID="72084d24cc256a164d380470d3a517d6d49179f56ce62666893f00f00964d3bf" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.143195 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.150504 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f6624a04-5ca4-4651-a91e-0a67f97c51b5","Type":"ContainerDied","Data":"1ce8535c7e0fcadee3e8f2222a51e30a7d30339d8968e3832a95ef56d642fb00"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.150903 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.156934 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51461d28-e850-4ba3-8f27-0252b51903f1","Type":"ContainerDied","Data":"f8342d53d9a73416382f9c581e30435c715537b9887674feb92ac0b3b7083a77"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.157064 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.160958 4990 generic.go:334] "Generic (PLEG): container finished" podID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" containerID="f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835" exitCode=0 Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.161015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6897c54f48-kp6tm" event={"ID":"b2d5088c-5854-4bee-9e3c-8198d4b7d377","Type":"ContainerDied","Data":"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.161031 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6897c54f48-kp6tm" event={"ID":"b2d5088c-5854-4bee-9e3c-8198d4b7d377","Type":"ContainerDied","Data":"b6b6e7d4741443d6712fd60c34ea6865f02f11ff44c5f45d9705dcc570ca396f"} Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.161072 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6897c54f48-kp6tm" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.172195 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f6624a04-5ca4-4651-a91e-0a67f97c51b5" (UID: "f6624a04-5ca4-4651-a91e-0a67f97c51b5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: E1003 10:06:05.199729 4990 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 10:06:05 crc kubenswrapper[4990]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-03T10:05:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 10:06:05 crc kubenswrapper[4990]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 03 10:06:05 crc kubenswrapper[4990]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-nfzkg" message=< Oct 03 10:06:05 crc kubenswrapper[4990]: Exiting ovn-controller (1) [FAILED] Oct 03 10:06:05 crc kubenswrapper[4990]: Killing ovn-controller (1) [ OK ] Oct 03 10:06:05 crc kubenswrapper[4990]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 03 10:06:05 crc kubenswrapper[4990]: 2025-10-03T10:05:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 10:06:05 crc kubenswrapper[4990]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 03 10:06:05 crc kubenswrapper[4990]: > Oct 03 10:06:05 crc kubenswrapper[4990]: E1003 10:06:05.199784 4990 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 10:06:05 crc kubenswrapper[4990]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-03T10:05:58Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 03 10:06:05 crc kubenswrapper[4990]: /etc/init.d/functions: line 589: 421 Alarm clock "$@" Oct 03 10:06:05 crc kubenswrapper[4990]: > pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" containerID="cri-o://626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.199832 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ovn-controller-nfzkg" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" containerID="cri-o://626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" gracePeriod=22 Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.232743 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233201 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpg2j\" (UniqueName: \"kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233324 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233360 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233398 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc 
kubenswrapper[4990]: I1003 10:06:05.233474 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233498 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233530 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys\") pod \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\" (UID: \"b2d5088c-5854-4bee-9e3c-8198d4b7d377\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233882 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f6624a04-5ca4-4651-a91e-0a67f97c51b5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.233894 4990 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f6624a04-5ca4-4651-a91e-0a67f97c51b5-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.252302 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.258907 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.258945 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts" (OuterVolumeSpecName: "scripts") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.259439 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j" (OuterVolumeSpecName: "kube-api-access-wpg2j") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "kube-api-access-wpg2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.279228 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data" (OuterVolumeSpecName: "config-data") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.282806 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.292773 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.295591 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b2d5088c-5854-4bee-9e3c-8198d4b7d377" (UID: "b2d5088c-5854-4bee-9e3c-8198d4b7d377"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.335466 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.336731 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.336810 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpg2j\" (UniqueName: \"kubernetes.io/projected/b2d5088c-5854-4bee-9e3c-8198d4b7d377-kube-api-access-wpg2j\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.336892 4990 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.337072 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.337163 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.337275 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.337351 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.337423 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d5088c-5854-4bee-9e3c-8198d4b7d377-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.344355 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.350697 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.355420 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.391174 4990 scope.go:117] "RemoveContainer" containerID="6304bb47401c2e416b699ae72b910fc4eb114553fd6d296ac55983e6561263e9" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.419785 4990 scope.go:117] "RemoveContainer" containerID="39ca9311a2a836a4c7c4e3966e9e9682edd65a8a88709915dec6b0db579a3d63" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.422214 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b23a7883-8397-4262-a891-916de94739fd/ovn-northd/0.log" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.422285 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.457395 4990 scope.go:117] "RemoveContainer" containerID="46a2006f531297eb507c3d080522e05f935cfe53f8d27382af0ef0806a9315a1" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543317 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543368 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543421 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543455 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543490 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7fm8\" (UniqueName: \"kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 
03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543528 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.543587 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle\") pod \"b23a7883-8397-4262-a891-916de94739fd\" (UID: \"b23a7883-8397-4262-a891-916de94739fd\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.544019 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts" (OuterVolumeSpecName: "scripts") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.545033 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.545399 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config" (OuterVolumeSpecName: "config") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.562953 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8" (OuterVolumeSpecName: "kube-api-access-s7fm8") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "kube-api-access-s7fm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.585097 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.646876 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.647992 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b23a7883-8397-4262-a891-916de94739fd-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.648469 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23a7883-8397-4262-a891-916de94739fd-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.648552 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7fm8\" (UniqueName: \"kubernetes.io/projected/b23a7883-8397-4262-a891-916de94739fd-kube-api-access-s7fm8\") on node \"crc\" 
DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.648621 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.659714 4990 scope.go:117] "RemoveContainer" containerID="6e738f47b293c30c60c0c8652362ba0397b75b7dc42d631b6865d77252a55f68" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.667623 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.699581 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b23a7883-8397-4262-a891-916de94739fd" (UID: "b23a7883-8397-4262-a891-916de94739fd"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.724465 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.734698 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.734974 4990 scope.go:117] "RemoveContainer" containerID="b4d7bb564bea84a147b666614fdc109dbffb801e15bd785bda000207cddae019" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.738499 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.746583 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751664 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x88v7\" (UniqueName: \"kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7\") pod \"1021ae3d-46d5-481e-b844-9086f9d8f946\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751739 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom\") pod \"1021ae3d-46d5-481e-b844-9086f9d8f946\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751802 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle\") pod \"1021ae3d-46d5-481e-b844-9086f9d8f946\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751871 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data\") pod \"e56fc3e5-d30b-4486-978a-46a13a5657e6\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") 
" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751965 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs\") pod \"e56fc3e5-d30b-4486-978a-46a13a5657e6\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.751988 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs\") pod \"1021ae3d-46d5-481e-b844-9086f9d8f946\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.752400 4990 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.752419 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23a7883-8397-4262-a891-916de94739fd-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.753054 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs" (OuterVolumeSpecName: "logs") pod "1021ae3d-46d5-481e-b844-9086f9d8f946" (UID: "1021ae3d-46d5-481e-b844-9086f9d8f946"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.757140 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs" (OuterVolumeSpecName: "logs") pod "e56fc3e5-d30b-4486-978a-46a13a5657e6" (UID: "e56fc3e5-d30b-4486-978a-46a13a5657e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.761137 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7" (OuterVolumeSpecName: "kube-api-access-x88v7") pod "1021ae3d-46d5-481e-b844-9086f9d8f946" (UID: "1021ae3d-46d5-481e-b844-9086f9d8f946"). InnerVolumeSpecName "kube-api-access-x88v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.761766 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nfzkg_a6d51bd5-1a8f-402d-80e1-441872e15719/ovn-controller/0.log" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.761863 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfzkg" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.764295 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1021ae3d-46d5-481e-b844-9086f9d8f946" (UID: "1021ae3d-46d5-481e-b844-9086f9d8f946"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.784358 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.784791 4990 scope.go:117] "RemoveContainer" containerID="f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.795796 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6897c54f48-kp6tm"] Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.816881 4990 scope.go:117] "RemoveContainer" containerID="f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835" Oct 03 10:06:05 crc kubenswrapper[4990]: E1003 10:06:05.817690 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835\": container with ID starting with f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835 not found: ID does not exist" containerID="f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.817719 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835"} err="failed to get container status \"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835\": rpc error: code = NotFound desc = could not find container \"f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835\": container with ID starting with f57e155fbf1c36816c676ea61bb1b7ff9cfeddf33e18f7099c7add57476b8835 not found: ID does not exist" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.818277 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1021ae3d-46d5-481e-b844-9086f9d8f946" (UID: "1021ae3d-46d5-481e-b844-9086f9d8f946"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.830899 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data" (OuterVolumeSpecName: "config-data") pod "e56fc3e5-d30b-4486-978a-46a13a5657e6" (UID: "e56fc3e5-d30b-4486-978a-46a13a5657e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.852905 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.852988 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data\") pod \"1021ae3d-46d5-481e-b844-9086f9d8f946\" (UID: \"1021ae3d-46d5-481e-b844-9086f9d8f946\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853052 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gr5p\" (UniqueName: \"kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853081 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853129 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle\") pod \"e56fc3e5-d30b-4486-978a-46a13a5657e6\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853155 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853215 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom\") pod \"e56fc3e5-d30b-4486-978a-46a13a5657e6\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853241 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853290 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 
10:06:05.853312 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn\") pod \"a6d51bd5-1a8f-402d-80e1-441872e15719\" (UID: \"a6d51bd5-1a8f-402d-80e1-441872e15719\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853362 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m87l\" (UniqueName: \"kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l\") pod \"e56fc3e5-d30b-4486-978a-46a13a5657e6\" (UID: \"e56fc3e5-d30b-4486-978a-46a13a5657e6\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.853573 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run" (OuterVolumeSpecName: "var-run") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854155 4990 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854177 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x88v7\" (UniqueName: \"kubernetes.io/projected/1021ae3d-46d5-481e-b844-9086f9d8f946-kube-api-access-x88v7\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854192 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854202 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854213 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854223 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e56fc3e5-d30b-4486-978a-46a13a5657e6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.854232 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1021ae3d-46d5-481e-b844-9086f9d8f946-logs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.855297 4990 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.855928 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.856574 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts" (OuterVolumeSpecName: "scripts") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.858096 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e56fc3e5-d30b-4486-978a-46a13a5657e6" (UID: "e56fc3e5-d30b-4486-978a-46a13a5657e6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.858253 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p" (OuterVolumeSpecName: "kube-api-access-2gr5p") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "kube-api-access-2gr5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.862054 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l" (OuterVolumeSpecName: "kube-api-access-5m87l") pod "e56fc3e5-d30b-4486-978a-46a13a5657e6" (UID: "e56fc3e5-d30b-4486-978a-46a13a5657e6"). InnerVolumeSpecName "kube-api-access-5m87l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.878795 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.885931 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e56fc3e5-d30b-4486-978a-46a13a5657e6" (UID: "e56fc3e5-d30b-4486-978a-46a13a5657e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.899498 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data" (OuterVolumeSpecName: "config-data") pod "1021ae3d-46d5-481e-b844-9086f9d8f946" (UID: "1021ae3d-46d5-481e-b844-9086f9d8f946"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.910546 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.919002 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "a6d51bd5-1a8f-402d-80e1-441872e15719" (UID: "a6d51bd5-1a8f-402d-80e1-441872e15719"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.955613 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nfkx\" (UniqueName: \"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx\") pod \"85167add-e116-4e56-950b-fe0a6a553732\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.955763 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle\") pod \"85167add-e116-4e56-950b-fe0a6a553732\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.955811 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data\") pod \"85167add-e116-4e56-950b-fe0a6a553732\" (UID: \"85167add-e116-4e56-950b-fe0a6a553732\") " Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956241 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956266 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1021ae3d-46d5-481e-b844-9086f9d8f946-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956280 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gr5p\" (UniqueName: \"kubernetes.io/projected/a6d51bd5-1a8f-402d-80e1-441872e15719-kube-api-access-2gr5p\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 
10:06:05.956297 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6d51bd5-1a8f-402d-80e1-441872e15719-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956308 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956322 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6d51bd5-1a8f-402d-80e1-441872e15719-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956335 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e56fc3e5-d30b-4486-978a-46a13a5657e6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956345 4990 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956356 4990 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6d51bd5-1a8f-402d-80e1-441872e15719-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.956368 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m87l\" (UniqueName: \"kubernetes.io/projected/e56fc3e5-d30b-4486-978a-46a13a5657e6-kube-api-access-5m87l\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.959781 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx" (OuterVolumeSpecName: "kube-api-access-5nfkx") pod "85167add-e116-4e56-950b-fe0a6a553732" (UID: "85167add-e116-4e56-950b-fe0a6a553732"). InnerVolumeSpecName "kube-api-access-5nfkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:05 crc kubenswrapper[4990]: I1003 10:06:05.979414 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85167add-e116-4e56-950b-fe0a6a553732" (UID: "85167add-e116-4e56-950b-fe0a6a553732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:05.986152 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data" (OuterVolumeSpecName: "config-data") pod "85167add-e116-4e56-950b-fe0a6a553732" (UID: "85167add-e116-4e56-950b-fe0a6a553732"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.058227 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nfkx\" (UniqueName: \"kubernetes.io/projected/85167add-e116-4e56-950b-fe0a6a553732-kube-api-access-5nfkx\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.058293 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.058307 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85167add-e116-4e56-950b-fe0a6a553732-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.173095 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b23a7883-8397-4262-a891-916de94739fd/ovn-northd/0.log" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.173169 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b23a7883-8397-4262-a891-916de94739fd","Type":"ContainerDied","Data":"f27f94a578be160c2c3eb32cf78425c6c1a79bdc65cb119b96898f5a29032847"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.173212 4990 scope.go:117] "RemoveContainer" containerID="6b3ab8b29ea9f1b7b6e8f40edcf60f82f96ba6bf30e0f0a4afbe62bd168e8f7a" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.173313 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.184589 4990 generic.go:334] "Generic (PLEG): container finished" podID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerID="a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793" exitCode=0 Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.184642 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-64b566fdb9-7b8mq" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.184909 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerDied","Data":"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.185026 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-64b566fdb9-7b8mq" event={"ID":"1021ae3d-46d5-481e-b844-9086f9d8f946","Type":"ContainerDied","Data":"3570d5cd09268ffa0be273b30970ac5d32e3b12e2c307a4fcbde0c24c3d8c6f3"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.192892 4990 generic.go:334] "Generic (PLEG): container finished" podID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerID="bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7" exitCode=0 Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.192975 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerDied","Data":"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.193005 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" 
event={"ID":"e56fc3e5-d30b-4486-978a-46a13a5657e6","Type":"ContainerDied","Data":"48515689fb9bb8d1abc710cdbd3893111b9a1c5d49a0cce9338cc76a6c2b0c8f"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.193074 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8499569686-hgsxg" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.199331 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nfzkg_a6d51bd5-1a8f-402d-80e1-441872e15719/ovn-controller/0.log" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.199374 4990 generic.go:334] "Generic (PLEG): container finished" podID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerID="626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" exitCode=137 Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.199421 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg" event={"ID":"a6d51bd5-1a8f-402d-80e1-441872e15719","Type":"ContainerDied","Data":"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.199445 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfzkg" event={"ID":"a6d51bd5-1a8f-402d-80e1-441872e15719","Type":"ContainerDied","Data":"d7222243b011d34e7019fc28eec4f33e7fd3717826982e1e441a5496ec8778e8"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.199500 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfzkg" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.204136 4990 generic.go:334] "Generic (PLEG): container finished" podID="85167add-e116-4e56-950b-fe0a6a553732" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" exitCode=0 Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.204202 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85167add-e116-4e56-950b-fe0a6a553732","Type":"ContainerDied","Data":"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.204261 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"85167add-e116-4e56-950b-fe0a6a553732","Type":"ContainerDied","Data":"b0dc24860cca9dcf1728b502d556e1fd044b3a4223fd1b3ee4e4491f0a3bd6fd"} Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.204465 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.209201 4990 scope.go:117] "RemoveContainer" containerID="7871e236e74cb6ae1f5cad66ad4b89c2125e25150e40edb22d50c75bed041cb2" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.226828 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.232741 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-64b566fdb9-7b8mq"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.244088 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.255519 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.263488 4990 scope.go:117] "RemoveContainer" containerID="a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.265210 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfzkg"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.272382 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfzkg"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.281031 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.284625 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8499569686-hgsxg"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.288723 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.289658 4990 scope.go:117] "RemoveContainer" 
containerID="9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.292490 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.311372 4990 scope.go:117] "RemoveContainer" containerID="a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793" Oct 03 10:06:06 crc kubenswrapper[4990]: E1003 10:06:06.311898 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793\": container with ID starting with a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793 not found: ID does not exist" containerID="a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.311940 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793"} err="failed to get container status \"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793\": rpc error: code = NotFound desc = could not find container \"a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793\": container with ID starting with a6369013aa001c224b535dbfc0d18bda2288f5b6528cdf8295d2a050892fb793 not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.311970 4990 scope.go:117] "RemoveContainer" containerID="9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb" Oct 03 10:06:06 crc kubenswrapper[4990]: E1003 10:06:06.312247 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb\": container with ID starting with 
9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb not found: ID does not exist" containerID="9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.312273 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb"} err="failed to get container status \"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb\": rpc error: code = NotFound desc = could not find container \"9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb\": container with ID starting with 9047bcf2d59fa4bfed7b44575efbf6f23f64cc38418e44d38e955a3d9b6acdbb not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.312289 4990 scope.go:117] "RemoveContainer" containerID="bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.337894 4990 scope.go:117] "RemoveContainer" containerID="09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.359841 4990 scope.go:117] "RemoveContainer" containerID="bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7" Oct 03 10:06:06 crc kubenswrapper[4990]: E1003 10:06:06.360416 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7\": container with ID starting with bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7 not found: ID does not exist" containerID="bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.360470 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7"} 
err="failed to get container status \"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7\": rpc error: code = NotFound desc = could not find container \"bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7\": container with ID starting with bc55f1a26c35d60a7dfc861a4b16a7dcfd7f35a2eaad58742b6cf26b2240c4b7 not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.360500 4990 scope.go:117] "RemoveContainer" containerID="09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83" Oct 03 10:06:06 crc kubenswrapper[4990]: E1003 10:06:06.360775 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83\": container with ID starting with 09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83 not found: ID does not exist" containerID="09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.360807 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83"} err="failed to get container status \"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83\": rpc error: code = NotFound desc = could not find container \"09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83\": container with ID starting with 09be646b857af56fcb9c33442a27a8179bfcb11530bccfca340298179b4ffe83 not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.360826 4990 scope.go:117] "RemoveContainer" containerID="626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.381490 4990 scope.go:117] "RemoveContainer" containerID="626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" Oct 03 10:06:06 crc kubenswrapper[4990]: 
E1003 10:06:06.382101 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce\": container with ID starting with 626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce not found: ID does not exist" containerID="626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.382167 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce"} err="failed to get container status \"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce\": rpc error: code = NotFound desc = could not find container \"626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce\": container with ID starting with 626850f6fa04eab12054d9e9c4508dd5ef262b097ea6a93d784f7ddd4f5c7dce not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.382197 4990 scope.go:117] "RemoveContainer" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.401557 4990 scope.go:117] "RemoveContainer" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" Oct 03 10:06:06 crc kubenswrapper[4990]: E1003 10:06:06.402256 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e\": container with ID starting with f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e not found: ID does not exist" containerID="f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.402296 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e"} err="failed to get container status \"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e\": rpc error: code = NotFound desc = could not find container \"f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e\": container with ID starting with f7e3cf53f984c0afe5c9b58f365e1d5fe8fa94710fd5040bd76a4e9a9d7ae56e not found: ID does not exist" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.886257 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" path="/var/lib/kubelet/pods/1021ae3d-46d5-481e-b844-9086f9d8f946/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.887117 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" path="/var/lib/kubelet/pods/51461d28-e850-4ba3-8f27-0252b51903f1/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.887750 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85167add-e116-4e56-950b-fe0a6a553732" path="/var/lib/kubelet/pods/85167add-e116-4e56-950b-fe0a6a553732/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.889126 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" path="/var/lib/kubelet/pods/8fe31a60-7e5f-40a8-acf3-d7a17c210e74/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.890095 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" path="/var/lib/kubelet/pods/a6d51bd5-1a8f-402d-80e1-441872e15719/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.891388 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b23a7883-8397-4262-a891-916de94739fd" path="/var/lib/kubelet/pods/b23a7883-8397-4262-a891-916de94739fd/volumes" Oct 03 10:06:06 crc 
kubenswrapper[4990]: I1003 10:06:06.892251 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" path="/var/lib/kubelet/pods/b2d5088c-5854-4bee-9e3c-8198d4b7d377/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.893103 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" path="/var/lib/kubelet/pods/e56fc3e5-d30b-4486-978a-46a13a5657e6/volumes" Oct 03 10:06:06 crc kubenswrapper[4990]: I1003 10:06:06.894795 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" path="/var/lib/kubelet/pods/f6624a04-5ca4-4651-a91e-0a67f97c51b5/volumes" Oct 03 10:06:07 crc kubenswrapper[4990]: I1003 10:06:07.822421 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 10:06:07 crc kubenswrapper[4990]: I1003 10:06:07.822419 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.205:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.235667 4990 generic.go:334] "Generic (PLEG): container finished" podID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerID="f726157640cde355a6b4fde9ac87cd11f712f5e45c77f82242ffce8ed67bd078" exitCode=0 Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.235733 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerDied","Data":"f726157640cde355a6b4fde9ac87cd11f712f5e45c77f82242ffce8ed67bd078"} Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.425878 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493179 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493236 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493280 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493320 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493348 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4xl4\" (UniqueName: \"kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4\") pod 
\"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493395 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493420 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.493459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle\") pod \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\" (UID: \"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5\") " Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.494815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.495521 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.498999 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4" (OuterVolumeSpecName: "kube-api-access-z4xl4") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "kube-api-access-z4xl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.499206 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts" (OuterVolumeSpecName: "scripts") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.530166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.551198 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.556535 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.589605 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data" (OuterVolumeSpecName: "config-data") pod "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" (UID: "6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596028 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596053 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596064 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596073 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596081 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596092 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596100 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:08 crc kubenswrapper[4990]: I1003 10:06:08.596109 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4xl4\" (UniqueName: \"kubernetes.io/projected/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5-kube-api-access-z4xl4\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.248764 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5","Type":"ContainerDied","Data":"6d7649f24ac5f6250b76e795a72b31471b440e7aac2b79862868b1216833db8e"} Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.248826 4990 scope.go:117] "RemoveContainer" containerID="9e36ab3093e7df92e341c1eb14c639feff278ab0a3155acc7a3df5e7970a5bb6" Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.248979 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.279857 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.284753 4990 scope.go:117] "RemoveContainer" containerID="77511c0f1dafbe0f6a7ce5a4f15e45796b6eadf3587bf8ee1730bb9c4a726c6d" Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.287791 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.308582 4990 scope.go:117] "RemoveContainer" containerID="f726157640cde355a6b4fde9ac87cd11f712f5e45c77f82242ffce8ed67bd078" Oct 03 10:06:09 crc kubenswrapper[4990]: I1003 10:06:09.337742 4990 scope.go:117] "RemoveContainer" containerID="0cf5ac29746ce882d6ae1c7168250fbf34eb77ced233198b98d8e39f0ab37bd4" Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.760598 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.761152 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.761679 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.761785 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.761992 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.764472 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:09 crc kubenswrapper[4990]: E1003 10:06:09.767383 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:09 crc 
kubenswrapper[4990]: E1003 10:06:09.767430 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:10 crc kubenswrapper[4990]: I1003 10:06:10.890998 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" path="/var/lib/kubelet/pods/6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5/volumes" Oct 03 10:06:11 crc kubenswrapper[4990]: E1003 10:06:11.847120 4990 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 03 10:06:11 crc kubenswrapper[4990]: E1003 10:06:11.847212 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:27.847191628 +0000 UTC m=+1369.643823485 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-httpd-config" not found Oct 03 10:06:11 crc kubenswrapper[4990]: E1003 10:06:11.847902 4990 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 03 10:06:11 crc kubenswrapper[4990]: E1003 10:06:11.847940 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config podName:44d77d08-ad9c-4524-8b12-3d9d204aaf1c nodeName:}" failed. No retries permitted until 2025-10-03 10:06:27.847927967 +0000 UTC m=+1369.644559824 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config") pod "neutron-6778f9d745-ft6gs" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c") : secret "neutron-config" not found Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.301026 4990 generic.go:334] "Generic (PLEG): container finished" podID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerID="c1e969493c2c4af3c85759ecb6094d97806d479c080aac159e4825c9e30be4a7" exitCode=0 Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.301100 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerDied","Data":"c1e969493c2c4af3c85759ecb6094d97806d479c080aac159e4825c9e30be4a7"} Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.434971 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.486913 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.486962 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.486995 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: 
\"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.487018 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.487046 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.487074 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgcxp\" (UniqueName: \"kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.487092 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs\") pod \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\" (UID: \"44d77d08-ad9c-4524-8b12-3d9d204aaf1c\") " Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.493411 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.495264 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp" (OuterVolumeSpecName: "kube-api-access-wgcxp") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "kube-api-access-wgcxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.552685 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.555066 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.566353 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.570875 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config" (OuterVolumeSpecName: "config") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588498 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588555 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588568 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgcxp\" (UniqueName: \"kubernetes.io/projected/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-kube-api-access-wgcxp\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588583 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588594 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.588606 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.589954 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "44d77d08-ad9c-4524-8b12-3d9d204aaf1c" (UID: "44d77d08-ad9c-4524-8b12-3d9d204aaf1c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:06:12 crc kubenswrapper[4990]: I1003 10:06:12.690358 4990 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/44d77d08-ad9c-4524-8b12-3d9d204aaf1c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.316100 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6778f9d745-ft6gs" event={"ID":"44d77d08-ad9c-4524-8b12-3d9d204aaf1c","Type":"ContainerDied","Data":"517dbb72f8123f4ff06aded4a73697fe77e5011aea1784f0e1d1562405d18e44"} Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.316447 4990 scope.go:117] "RemoveContainer" containerID="275691f37f6f67a2de209a5a1eaf8c7ee5473a0d43e2287e05d49c37f8c3fa41" Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.316195 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6778f9d745-ft6gs" Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.343178 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.347726 4990 scope.go:117] "RemoveContainer" containerID="c1e969493c2c4af3c85759ecb6094d97806d479c080aac159e4825c9e30be4a7" Oct 03 10:06:13 crc kubenswrapper[4990]: I1003 10:06:13.349579 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6778f9d745-ft6gs"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.779980 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.780226 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.782797 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.782997 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.785132 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.785172 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.785687 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:14 crc kubenswrapper[4990]: E1003 10:06:14.785715 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" 
podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:14 crc kubenswrapper[4990]: I1003 10:06:14.880910 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" path="/var/lib/kubelet/pods/44d77d08-ad9c-4524-8b12-3d9d204aaf1c/volumes" Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.760049 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.760654 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.761049 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.761184 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process 
not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.761220 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.762332 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.763390 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:19 crc kubenswrapper[4990]: E1003 10:06:19.763428 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.214240 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 
10:06:22.215111 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215130 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215154 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="init" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215161 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="init" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215183 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215190 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215200 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215209 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215221 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215229 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215243 4990 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="ovsdbserver-nb" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215251 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="ovsdbserver-nb" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215261 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215270 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215280 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="mysql-bootstrap" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215288 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="mysql-bootstrap" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215302 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215309 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215321 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-notification-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215328 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-notification-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215345 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="cinder-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215352 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="cinder-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215359 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85167add-e116-4e56-950b-fe0a6a553732" containerName="nova-cell0-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215365 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="85167add-e116-4e56-950b-fe0a6a553732" containerName="nova-cell0-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215374 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215380 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215391 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c4e608-45c9-447a-802d-dc405aac76e4" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215397 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c4e608-45c9-447a-802d-dc405aac76e4" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215405 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215412 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215420 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215427 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215435 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerName="mysql-bootstrap" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215442 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerName="mysql-bootstrap" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215454 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215461 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215469 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215475 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215487 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215494 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215502 4990 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215514 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215528 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825c3741-d390-4a7c-b3a6-50e268fbe712" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215535 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="825c3741-d390-4a7c-b3a6-50e268fbe712" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215564 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="setup-container" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215571 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="setup-container" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215582 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-central-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215589 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-central-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215599 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="probe" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215606 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="probe" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215618 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215625 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215634 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215643 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215650 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215657 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215665 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215675 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215687 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" containerName="keystone-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215694 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" containerName="keystone-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215704 4990 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215711 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215727 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215733 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215744 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215750 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215759 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215765 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215774 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="sg-core" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215781 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="sg-core" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215791 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215799 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215807 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215814 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215827 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215834 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215844 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230b4581-35e6-4c97-9f63-73e70624bf5c" containerName="memcached" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215850 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="230b4581-35e6-4c97-9f63-73e70624bf5c" containerName="memcached" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215857 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerName="nova-cell1-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215863 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerName="nova-cell1-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215872 4990 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215878 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215892 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215905 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215914 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215920 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215928 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="setup-container" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215936 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="setup-container" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215946 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8359ce-901e-400a-926a-b3060c2dc789" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215954 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8359ce-901e-400a-926a-b3060c2dc789" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215967 4990 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215976 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.215987 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.215994 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216003 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerName="nova-scheduler-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216010 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerName="nova-scheduler-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216017 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56fcd909-29b5-472d-8007-84fc511ac818" containerName="kube-state-metrics" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216024 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="56fcd909-29b5-472d-8007-84fc511ac818" containerName="kube-state-metrics" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216034 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216041 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 
10:06:22.216051 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="ovsdbserver-sb" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216059 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="ovsdbserver-sb" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216070 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-server" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216079 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-server" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216087 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216095 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216110 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216118 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216128 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64d7262-ab55-4d88-bb9c-02825e07721a" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216136 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64d7262-ab55-4d88-bb9c-02825e07721a" containerName="mariadb-account-delete" Oct 03 10:06:22 crc 
kubenswrapper[4990]: E1003 10:06:22.216149 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="dnsmasq-dns" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216156 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="dnsmasq-dns" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216165 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216174 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: E1003 10:06:22.216183 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216191 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216351 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216365 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216372 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216381 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-server" Oct 03 10:06:22 crc 
kubenswrapper[4990]: I1003 10:06:22.216389 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216401 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fad2c33-3d08-4ab8-91b2-dea27b8dc05c" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216409 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51461d28-e850-4ba3-8f27-0252b51903f1" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216420 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216429 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216438 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d77d08-ad9c-4524-8b12-3d9d204aaf1c" containerName="neutron-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216446 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216456 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="41126aba-eb3b-4f29-89ab-29a3ea1addd9" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216465 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b89027-cf5d-4807-adc3-b4915304f1f2" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216479 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e56fc3e5-d30b-4486-978a-46a13a5657e6" containerName="barbican-keystone-listener-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216489 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="proxy-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216501 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216517 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d5088c-5854-4bee-9e3c-8198d4b7d377" containerName="keystone-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216524 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1021ae3d-46d5-481e-b844-9086f9d8f946" containerName="barbican-worker-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216547 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe27b2c-9f5a-4687-bff9-8a36d03f8a90" containerName="nova-scheduler-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216555 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="ovsdbserver-sb" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216565 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="ovn-northd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216575 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="85167add-e116-4e56-950b-fe0a6a553732" containerName="nova-cell0-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216590 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64d7262-ab55-4d88-bb9c-02825e07721a" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 
10:06:22.216599 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb4021b-c9ef-4b31-864d-d4874b51e47c" containerName="barbican-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216607 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="probe" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216615 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6624a04-5ca4-4651-a91e-0a67f97c51b5" containerName="rabbitmq" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216626 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="971e9963-b7ee-4ee8-872a-2f696bbfdb40" containerName="glance-httpd" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216634 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4064799a-3601-4426-a225-151729d11c97" containerName="cinder-scheduler" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216646 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216656 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="463798bd-8799-4206-bf0c-b2f62f1fc1d0" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216666 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f251a942-6e8b-4f2e-a6e8-b505e4921b19" containerName="nova-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216673 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216684 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8359ce-901e-400a-926a-b3060c2dc789" containerName="mariadb-account-delete" Oct 03 10:06:22 crc 
kubenswrapper[4990]: I1003 10:06:22.216692 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="230b4581-35e6-4c97-9f63-73e70624bf5c" containerName="memcached" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216702 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30b2f82-0dc9-4f91-a1a8-8f3c22fc380e" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216713 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="56fcd909-29b5-472d-8007-84fc511ac818" containerName="kube-state-metrics" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216720 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac157dc7-6df6-4f4f-ba65-c85b58f78fff" containerName="nova-cell1-conductor-conductor" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216732 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d51bd5-1a8f-402d-80e1-441872e15719" containerName="ovn-controller" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216744 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216753 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23a7883-8397-4262-a891-916de94739fd" containerName="openstack-network-exporter" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216762 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c4e608-45c9-447a-802d-dc405aac76e4" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216773 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f14bb4e-f980-48fb-bba4-c068419b1975" containerName="placement-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216789 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fe31a60-7e5f-40a8-acf3-d7a17c210e74" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216798 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafd4ca6-6b2d-4c8e-b285-e5b29d2f4507" containerName="nova-metadata-metadata" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216806 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c15976-1c83-43b6-8077-6af8ecc010dc" containerName="dnsmasq-dns" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216816 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-central-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216824 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="sg-core" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216832 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a22247-2803-4910-a44a-9ccba673c2cf" containerName="galera" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216842 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="afea80e6-894d-41cd-b107-926d012e9f35" containerName="glance-log" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216849 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3b6bcb-ae6c-47a2-aa97-46a21b0804c5" containerName="ceilometer-notification-agent" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216857 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaf61b7-7fcf-40c6-91c4-56ed9720cdf9" containerName="ovsdbserver-nb" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216866 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b682e49-2ca7-4692-b989-28dfbd26163e" containerName="cinder-api" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.216876 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="825c3741-d390-4a7c-b3a6-50e268fbe712" containerName="mariadb-account-delete" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.218080 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.220504 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.235673 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4kn\" (UniqueName: \"kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.235870 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.235923 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.337119 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4kn\" (UniqueName: \"kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn\") pod 
\"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.337218 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.337252 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.337836 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.338066 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.358419 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4kn\" (UniqueName: \"kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn\") pod \"redhat-operators-t7qwk\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " 
pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.544046 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:22 crc kubenswrapper[4990]: I1003 10:06:22.978897 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:23 crc kubenswrapper[4990]: I1003 10:06:23.423232 4990 generic.go:334] "Generic (PLEG): container finished" podID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerID="c41b5ac998fe38b126154649039837bad3415f1f66edb4446ee6b9eaef199eee" exitCode=0 Oct 03 10:06:23 crc kubenswrapper[4990]: I1003 10:06:23.423276 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerDied","Data":"c41b5ac998fe38b126154649039837bad3415f1f66edb4446ee6b9eaef199eee"} Oct 03 10:06:23 crc kubenswrapper[4990]: I1003 10:06:23.423304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerStarted","Data":"3ef87ac3be659f70e87db7af90fff211678ae7b17de098e79047c693eb791e09"} Oct 03 10:06:23 crc kubenswrapper[4990]: I1003 10:06:23.425267 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.759744 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 
10:06:24.760807 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.761458 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.761498 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.761991 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.763248 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.764824 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 10:06:24 crc kubenswrapper[4990]: E1003 10:06:24.764873 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-zxxk7" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:25 crc kubenswrapper[4990]: I1003 10:06:25.442843 4990 generic.go:334] "Generic (PLEG): container finished" podID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerID="64327114b55bb8954095a6031fb00b73529e2ce884ca127e8be43019bb49daf7" exitCode=0 Oct 03 10:06:25 crc kubenswrapper[4990]: I1003 10:06:25.442903 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerDied","Data":"64327114b55bb8954095a6031fb00b73529e2ce884ca127e8be43019bb49daf7"} Oct 03 10:06:26 crc kubenswrapper[4990]: I1003 10:06:26.456126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerStarted","Data":"9e33f87f3c3ab85c4a60fc57ebdabfd0cdbd008024f6e737b82bd18df6ce1c30"} Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.137345 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.212544 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache\") pod \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.212821 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock\") pod \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213092 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213218 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ztd\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd\") pod \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") pod \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\" (UID: \"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6\") " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213429 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock" (OuterVolumeSpecName: "lock") pod 
"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213451 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache" (OuterVolumeSpecName: "cache") pod "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213814 4990 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-cache\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.213831 4990 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-lock\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.218781 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.219129 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd" (OuterVolumeSpecName: "kube-api-access-24ztd") pod "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6"). InnerVolumeSpecName "kube-api-access-24ztd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.219190 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" (UID: "cca92a2a-2e3d-4e52-8ed8-a4dc709915b6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.314651 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.314686 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ztd\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-kube-api-access-24ztd\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.314698 4990 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.329979 4990 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.416101 4990 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.509545 4990 generic.go:334] "Generic (PLEG): container finished" podID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerID="5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c" 
exitCode=137 Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.509632 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c"} Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.509661 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"cca92a2a-2e3d-4e52-8ed8-a4dc709915b6","Type":"ContainerDied","Data":"127cc36df918eeae38ea1d8c57b81fbff89e450fc2bd33ee21659f582a24901b"} Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.509679 4990 scope.go:117] "RemoveContainer" containerID="5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.509850 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.525079 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zxxk7_ea0bd28b-825b-4ba5-8838-f3bc695b0613/ovs-vswitchd/0.log" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.534101 4990 generic.go:334] "Generic (PLEG): container finished" podID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" exitCode=137 Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.535181 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerDied","Data":"7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a"} Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.575747 4990 scope.go:117] "RemoveContainer" containerID="540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.583339 4990 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t7qwk" podStartSLOduration=2.757956435 podStartE2EDuration="5.583324086s" podCreationTimestamp="2025-10-03 10:06:22 +0000 UTC" firstStartedPulling="2025-10-03 10:06:23.425030028 +0000 UTC m=+1365.221661885" lastFinishedPulling="2025-10-03 10:06:26.250397669 +0000 UTC m=+1368.047029536" observedRunningTime="2025-10-03 10:06:27.579966331 +0000 UTC m=+1369.376598188" watchObservedRunningTime="2025-10-03 10:06:27.583324086 +0000 UTC m=+1369.379955943" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.632855 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.650335 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.655185 4990 scope.go:117] "RemoveContainer" containerID="80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.680694 4990 scope.go:117] "RemoveContainer" containerID="21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.702233 4990 scope.go:117] "RemoveContainer" containerID="7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.729815 4990 scope.go:117] "RemoveContainer" containerID="9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.749449 4990 scope.go:117] "RemoveContainer" containerID="18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.773846 4990 scope.go:117] "RemoveContainer" containerID="6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 
10:06:27.796300 4990 scope.go:117] "RemoveContainer" containerID="d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.820296 4990 scope.go:117] "RemoveContainer" containerID="005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.850021 4990 scope.go:117] "RemoveContainer" containerID="c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.883584 4990 scope.go:117] "RemoveContainer" containerID="751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.940193 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zxxk7_ea0bd28b-825b-4ba5-8838-f3bc695b0613/ovs-vswitchd/0.log" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.941417 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.943322 4990 scope.go:117] "RemoveContainer" containerID="be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.972864 4990 scope.go:117] "RemoveContainer" containerID="e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555" Oct 03 10:06:27 crc kubenswrapper[4990]: I1003 10:06:27.997060 4990 scope.go:117] "RemoveContainer" containerID="92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.031382 4990 scope.go:117] "RemoveContainer" containerID="5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.031898 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c\": container with ID starting with 5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c not found: ID does not exist" containerID="5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.031943 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c"} err="failed to get container status \"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c\": rpc error: code = NotFound desc = could not find container \"5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c\": container with ID starting with 5bc27c25d831fba8be880d9ae5b350d747108ed4d9629f4c7f8368611c58630c not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.031967 4990 scope.go:117] "RemoveContainer" 
containerID="540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.032270 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4\": container with ID starting with 540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4 not found: ID does not exist" containerID="540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032289 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4"} err="failed to get container status \"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4\": rpc error: code = NotFound desc = could not find container \"540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4\": container with ID starting with 540f6dbfaadf3781817fcbe1eab83cd6ec64d394d1200cfb0353ab2c82d302f4 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032301 4990 scope.go:117] "RemoveContainer" containerID="80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.032561 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1\": container with ID starting with 80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1 not found: ID does not exist" containerID="80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032582 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1"} err="failed to get container status \"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1\": rpc error: code = NotFound desc = could not find container \"80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1\": container with ID starting with 80e002c1fab1102e0c289d87245e33b7701eec0c9130832fe64cb7fa102ca0d1 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032623 4990 scope.go:117] "RemoveContainer" containerID="21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.032875 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265\": container with ID starting with 21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265 not found: ID does not exist" containerID="21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032897 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265"} err="failed to get container status \"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265\": rpc error: code = NotFound desc = could not find container \"21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265\": container with ID starting with 21fda2e8f2079ef48377a3e8f321b84263338603d815770f413ee028f7373265 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.032915 4990 scope.go:117] "RemoveContainer" containerID="7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.033342 4990 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574\": container with ID starting with 7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574 not found: ID does not exist" containerID="7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.033393 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574"} err="failed to get container status \"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574\": rpc error: code = NotFound desc = could not find container \"7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574\": container with ID starting with 7e8ac4df7c6745196687a5435ce3a6fedcf0c5c7283e8309244498c9a84e0574 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.033410 4990 scope.go:117] "RemoveContainer" containerID="9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.033890 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086\": container with ID starting with 9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086 not found: ID does not exist" containerID="9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.033918 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086"} err="failed to get container status \"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086\": rpc error: code = NotFound desc = could not find container 
\"9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086\": container with ID starting with 9b392154f1f7e1d7c632fb2c7fdca64a0c224f8a6534181bacc9f33d46b40086 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.033964 4990 scope.go:117] "RemoveContainer" containerID="18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.034358 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54\": container with ID starting with 18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54 not found: ID does not exist" containerID="18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034399 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54"} err="failed to get container status \"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54\": rpc error: code = NotFound desc = could not find container \"18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54\": container with ID starting with 18865fd5e29be3183461c52f27264daeca96249e4a4522faacfe484bef926c54 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034414 4990 scope.go:117] "RemoveContainer" containerID="6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.034646 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5\": container with ID starting with 6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5 not found: ID does not exist" 
containerID="6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034680 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5"} err="failed to get container status \"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5\": rpc error: code = NotFound desc = could not find container \"6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5\": container with ID starting with 6d60b25edf0f287dcf5474781be0a4fb5253a9faee1d24de38d0e7f9cd08aee5 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034692 4990 scope.go:117] "RemoveContainer" containerID="d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.034899 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157\": container with ID starting with d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157 not found: ID does not exist" containerID="d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034934 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157"} err="failed to get container status \"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157\": rpc error: code = NotFound desc = could not find container \"d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157\": container with ID starting with d2d1626a570efe80420326f196dc057d78e1707f308e9317f62b39826efde157 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.034947 4990 scope.go:117] 
"RemoveContainer" containerID="005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.035155 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4\": container with ID starting with 005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4 not found: ID does not exist" containerID="005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.035175 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4"} err="failed to get container status \"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4\": rpc error: code = NotFound desc = could not find container \"005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4\": container with ID starting with 005331767e6dfb3ca5c53454d449ec0f3ff45bf0c8e99df8f883153e97758cb4 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.035186 4990 scope.go:117] "RemoveContainer" containerID="c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.036435 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab\": container with ID starting with c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab not found: ID does not exist" containerID="c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.036481 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab"} err="failed to get container status \"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab\": rpc error: code = NotFound desc = could not find container \"c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab\": container with ID starting with c2372cfcf16206945516f58d4587847e6df2346b31aecfe92d74638ebb5d5cab not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.036495 4990 scope.go:117] "RemoveContainer" containerID="751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.036735 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7\": container with ID starting with 751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7 not found: ID does not exist" containerID="751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.036755 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7"} err="failed to get container status \"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7\": rpc error: code = NotFound desc = could not find container \"751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7\": container with ID starting with 751afc5346fcc8ff381e604dca4f8dfaf22a06a670a445f954ab151f47765fe7 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.036805 4990 scope.go:117] "RemoveContainer" containerID="be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.037065 4990 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182\": container with ID starting with be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182 not found: ID does not exist" containerID="be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.037087 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182"} err="failed to get container status \"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182\": rpc error: code = NotFound desc = could not find container \"be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182\": container with ID starting with be4e5e890caa4fe3b48eb3f151b1fa2df542fca573a554286515d9071769a182 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.037121 4990 scope.go:117] "RemoveContainer" containerID="e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.037351 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555\": container with ID starting with e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555 not found: ID does not exist" containerID="e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.037371 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555"} err="failed to get container status \"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555\": rpc error: code = NotFound desc = could not find container 
\"e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555\": container with ID starting with e867b355413d2b95643f6e8c0c7699c52e3aa1a57bb8701e36210c25a8905555 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.037382 4990 scope.go:117] "RemoveContainer" containerID="92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301" Oct 03 10:06:28 crc kubenswrapper[4990]: E1003 10:06:28.037644 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301\": container with ID starting with 92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301 not found: ID does not exist" containerID="92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.037669 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301"} err="failed to get container status \"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301\": rpc error: code = NotFound desc = could not find container \"92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301\": container with ID starting with 92d35b08ce7b9e16e43ba1bdba41380b67b53e13a83af4ba5d7179047a055301 not found: ID does not exist" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125230 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t82d2\" (UniqueName: \"kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125806 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125881 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125920 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125907 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125944 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib" (OuterVolumeSpecName: "var-lib") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.125993 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log" (OuterVolumeSpecName: "var-log") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.126146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127190 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts" (OuterVolumeSpecName: "scripts") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127291 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run\") pod \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\" (UID: \"ea0bd28b-825b-4ba5-8838-f3bc695b0613\") " Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127364 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run" (OuterVolumeSpecName: "var-run") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127788 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea0bd28b-825b-4ba5-8838-f3bc695b0613-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127811 4990 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127824 4990 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127836 4990 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-log\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.127848 4990 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ea0bd28b-825b-4ba5-8838-f3bc695b0613-var-lib\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.132724 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2" (OuterVolumeSpecName: "kube-api-access-t82d2") pod "ea0bd28b-825b-4ba5-8838-f3bc695b0613" (UID: "ea0bd28b-825b-4ba5-8838-f3bc695b0613"). InnerVolumeSpecName "kube-api-access-t82d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.229649 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t82d2\" (UniqueName: \"kubernetes.io/projected/ea0bd28b-825b-4ba5-8838-f3bc695b0613-kube-api-access-t82d2\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.549222 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zxxk7_ea0bd28b-825b-4ba5-8838-f3bc695b0613/ovs-vswitchd/0.log" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.550026 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zxxk7" event={"ID":"ea0bd28b-825b-4ba5-8838-f3bc695b0613","Type":"ContainerDied","Data":"e861d179a141cd79e03a597643e1d9fe8963fcddf557ddb8cf8f6c8c663a5516"} Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.550084 4990 scope.go:117] "RemoveContainer" containerID="7294a3ec827e168d85a4051301d7006b196f54659804907e1ff0ca789fd1e50a" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.550101 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zxxk7" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.574924 4990 scope.go:117] "RemoveContainer" containerID="554fec54c55ea089726054ec418a50587851ed5044db689030610348162de7a7" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.585707 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"] Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.592308 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-zxxk7"] Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.599796 4990 scope.go:117] "RemoveContainer" containerID="7458a9d1d574a85e34d93996629001a19ee6d5414dafa8bf5462a2fecc3238db" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.883423 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" path="/var/lib/kubelet/pods/cca92a2a-2e3d-4e52-8ed8-a4dc709915b6/volumes" Oct 03 10:06:28 crc kubenswrapper[4990]: I1003 10:06:28.885943 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" path="/var/lib/kubelet/pods/ea0bd28b-825b-4ba5-8838-f3bc695b0613/volumes" Oct 03 10:06:32 crc kubenswrapper[4990]: I1003 10:06:32.544484 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:32 crc kubenswrapper[4990]: I1003 10:06:32.546805 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:32 crc kubenswrapper[4990]: I1003 10:06:32.590075 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:33 crc kubenswrapper[4990]: I1003 10:06:33.664207 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 
10:06:33 crc kubenswrapper[4990]: I1003 10:06:33.718654 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:35 crc kubenswrapper[4990]: I1003 10:06:35.613582 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t7qwk" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="registry-server" containerID="cri-o://9e33f87f3c3ab85c4a60fc57ebdabfd0cdbd008024f6e737b82bd18df6ce1c30" gracePeriod=2 Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.650892 4990 generic.go:334] "Generic (PLEG): container finished" podID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerID="9e33f87f3c3ab85c4a60fc57ebdabfd0cdbd008024f6e737b82bd18df6ce1c30" exitCode=0 Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.650989 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerDied","Data":"9e33f87f3c3ab85c4a60fc57ebdabfd0cdbd008024f6e737b82bd18df6ce1c30"} Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.702692 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.869861 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4kn\" (UniqueName: \"kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn\") pod \"c47dd420-cb30-4581-b944-77cd4a65b82c\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.870026 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content\") pod \"c47dd420-cb30-4581-b944-77cd4a65b82c\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.870168 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities\") pod \"c47dd420-cb30-4581-b944-77cd4a65b82c\" (UID: \"c47dd420-cb30-4581-b944-77cd4a65b82c\") " Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.871825 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities" (OuterVolumeSpecName: "utilities") pod "c47dd420-cb30-4581-b944-77cd4a65b82c" (UID: "c47dd420-cb30-4581-b944-77cd4a65b82c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.884737 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn" (OuterVolumeSpecName: "kube-api-access-4l4kn") pod "c47dd420-cb30-4581-b944-77cd4a65b82c" (UID: "c47dd420-cb30-4581-b944-77cd4a65b82c"). InnerVolumeSpecName "kube-api-access-4l4kn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.972034 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.972100 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4kn\" (UniqueName: \"kubernetes.io/projected/c47dd420-cb30-4581-b944-77cd4a65b82c-kube-api-access-4l4kn\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:38 crc kubenswrapper[4990]: I1003 10:06:38.989068 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c47dd420-cb30-4581-b944-77cd4a65b82c" (UID: "c47dd420-cb30-4581-b944-77cd4a65b82c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.073169 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c47dd420-cb30-4581-b944-77cd4a65b82c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.661179 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7qwk" event={"ID":"c47dd420-cb30-4581-b944-77cd4a65b82c","Type":"ContainerDied","Data":"3ef87ac3be659f70e87db7af90fff211678ae7b17de098e79047c693eb791e09"} Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.661229 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7qwk" Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.661543 4990 scope.go:117] "RemoveContainer" containerID="9e33f87f3c3ab85c4a60fc57ebdabfd0cdbd008024f6e737b82bd18df6ce1c30" Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.694675 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.697652 4990 scope.go:117] "RemoveContainer" containerID="64327114b55bb8954095a6031fb00b73529e2ce884ca127e8be43019bb49daf7" Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.702922 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t7qwk"] Oct 03 10:06:39 crc kubenswrapper[4990]: I1003 10:06:39.743708 4990 scope.go:117] "RemoveContainer" containerID="c41b5ac998fe38b126154649039837bad3415f1f66edb4446ee6b9eaef199eee" Oct 03 10:06:40 crc kubenswrapper[4990]: I1003 10:06:40.884166 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" path="/var/lib/kubelet/pods/c47dd420-cb30-4581-b944-77cd4a65b82c/volumes" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.108664 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109582 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="extract-content" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109598 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="extract-content" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109612 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:55 crc 
kubenswrapper[4990]: I1003 10:06:55.109619 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109631 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109638 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109647 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="extract-utilities" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109653 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="extract-utilities" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109660 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="rsync" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109673 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="rsync" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109689 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="registry-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109696 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="registry-server" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109710 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 
10:06:55.109716 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109733 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="swift-recon-cron" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109739 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="swift-recon-cron" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109753 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109762 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109774 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109780 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109794 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109800 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-server" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109808 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-expirer" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 
10:06:55.109815 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-expirer" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109828 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109833 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109842 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109848 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109859 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109865 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109876 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109883 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109894 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 
10:06:55.109900 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-server" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109908 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server-init" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109915 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server-init" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109930 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-reaper" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109938 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-reaper" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109947 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109954 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-server" Oct 03 10:06:55 crc kubenswrapper[4990]: E1003 10:06:55.109967 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.109975 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110114 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 
10:06:55.110125 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-expirer" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110137 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110152 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110167 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovs-vswitchd" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110179 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-replicator" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110195 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110203 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="container-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110210 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-reaper" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110221 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="swift-recon-cron" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110231 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47dd420-cb30-4581-b944-77cd4a65b82c" containerName="registry-server" Oct 03 
10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110243 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0bd28b-825b-4ba5-8838-f3bc695b0613" containerName="ovsdb-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110256 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110267 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="rsync" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110281 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-updater" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110288 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-auditor" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110299 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="object-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.110307 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca92a2a-2e3d-4e52-8ed8-a4dc709915b6" containerName="account-server" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.111573 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.127123 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.200230 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.200347 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trcj\" (UniqueName: \"kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.200365 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.301472 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5trcj\" (UniqueName: \"kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.301547 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.301609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.302245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.302292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.324449 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trcj\" (UniqueName: \"kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj\") pod \"certified-operators-77w9v\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.443041 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:06:55 crc kubenswrapper[4990]: I1003 10:06:55.914276 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:06:56 crc kubenswrapper[4990]: I1003 10:06:56.813791 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerID="929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f" exitCode=0 Oct 03 10:06:56 crc kubenswrapper[4990]: I1003 10:06:56.813857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerDied","Data":"929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f"} Oct 03 10:06:56 crc kubenswrapper[4990]: I1003 10:06:56.813912 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerStarted","Data":"1383b992efd90b6e9981cb9745eead3c7483afaddecbd096000a964e57f51e3c"} Oct 03 10:06:58 crc kubenswrapper[4990]: I1003 10:06:58.831970 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerID="bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5" exitCode=0 Oct 03 10:06:58 crc kubenswrapper[4990]: I1003 10:06:58.832370 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerDied","Data":"bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5"} Oct 03 10:06:59 crc kubenswrapper[4990]: I1003 10:06:59.842093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" 
event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerStarted","Data":"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7"} Oct 03 10:06:59 crc kubenswrapper[4990]: I1003 10:06:59.864118 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77w9v" podStartSLOduration=2.356912986 podStartE2EDuration="4.864093647s" podCreationTimestamp="2025-10-03 10:06:55 +0000 UTC" firstStartedPulling="2025-10-03 10:06:56.815888188 +0000 UTC m=+1398.612520045" lastFinishedPulling="2025-10-03 10:06:59.323068849 +0000 UTC m=+1401.119700706" observedRunningTime="2025-10-03 10:06:59.861785128 +0000 UTC m=+1401.658417005" watchObservedRunningTime="2025-10-03 10:06:59.864093647 +0000 UTC m=+1401.660725504" Oct 03 10:07:05 crc kubenswrapper[4990]: I1003 10:07:05.443977 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:05 crc kubenswrapper[4990]: I1003 10:07:05.444056 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:05 crc kubenswrapper[4990]: I1003 10:07:05.500393 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:05 crc kubenswrapper[4990]: I1003 10:07:05.932474 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:05 crc kubenswrapper[4990]: I1003 10:07:05.983306 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:07:07 crc kubenswrapper[4990]: I1003 10:07:07.914458 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77w9v" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="registry-server" 
containerID="cri-o://1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7" gracePeriod=2 Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.846154 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.923468 4990 generic.go:334] "Generic (PLEG): container finished" podID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerID="1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7" exitCode=0 Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.923563 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77w9v" Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.923576 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerDied","Data":"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7"} Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.923613 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77w9v" event={"ID":"3e39c15d-a611-4e43-8f61-0fb4f820cb39","Type":"ContainerDied","Data":"1383b992efd90b6e9981cb9745eead3c7483afaddecbd096000a964e57f51e3c"} Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.923638 4990 scope.go:117] "RemoveContainer" containerID="1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7" Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.950304 4990 scope.go:117] "RemoveContainer" containerID="bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5" Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.981936 4990 scope.go:117] "RemoveContainer" containerID="929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f" Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.996860 
4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities\") pod \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.997024 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content\") pod \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " Oct 03 10:07:08 crc kubenswrapper[4990]: I1003 10:07:08.997133 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5trcj\" (UniqueName: \"kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj\") pod \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\" (UID: \"3e39c15d-a611-4e43-8f61-0fb4f820cb39\") " Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:08.998667 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities" (OuterVolumeSpecName: "utilities") pod "3e39c15d-a611-4e43-8f61-0fb4f820cb39" (UID: "3e39c15d-a611-4e43-8f61-0fb4f820cb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.004976 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj" (OuterVolumeSpecName: "kube-api-access-5trcj") pod "3e39c15d-a611-4e43-8f61-0fb4f820cb39" (UID: "3e39c15d-a611-4e43-8f61-0fb4f820cb39"). InnerVolumeSpecName "kube-api-access-5trcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.012668 4990 scope.go:117] "RemoveContainer" containerID="1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7" Oct 03 10:07:09 crc kubenswrapper[4990]: E1003 10:07:09.013233 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7\": container with ID starting with 1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7 not found: ID does not exist" containerID="1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.013297 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7"} err="failed to get container status \"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7\": rpc error: code = NotFound desc = could not find container \"1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7\": container with ID starting with 1c0ca7ef88a0351e482bb2bfe156640514e1e725f1b242d659a9578250df9da7 not found: ID does not exist" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.013329 4990 scope.go:117] "RemoveContainer" containerID="bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5" Oct 03 10:07:09 crc kubenswrapper[4990]: E1003 10:07:09.013660 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5\": container with ID starting with bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5 not found: ID does not exist" containerID="bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.013685 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5"} err="failed to get container status \"bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5\": rpc error: code = NotFound desc = could not find container \"bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5\": container with ID starting with bc1af155977f694613b5256b0ae7706cd46e142eb5cf2342b1bd8208fb94a2a5 not found: ID does not exist" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.013699 4990 scope.go:117] "RemoveContainer" containerID="929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f" Oct 03 10:07:09 crc kubenswrapper[4990]: E1003 10:07:09.013917 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f\": container with ID starting with 929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f not found: ID does not exist" containerID="929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.013947 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f"} err="failed to get container status \"929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f\": rpc error: code = NotFound desc = could not find container \"929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f\": container with ID starting with 929fbe981308c8558a59d6229480493fcb485db1ceb510f579596c1d2cc4042f not found: ID does not exist" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.098986 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5trcj\" (UniqueName: 
\"kubernetes.io/projected/3e39c15d-a611-4e43-8f61-0fb4f820cb39-kube-api-access-5trcj\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.099037 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.570685 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e39c15d-a611-4e43-8f61-0fb4f820cb39" (UID: "3e39c15d-a611-4e43-8f61-0fb4f820cb39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.606423 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e39c15d-a611-4e43-8f61-0fb4f820cb39-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.864979 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:07:09 crc kubenswrapper[4990]: I1003 10:07:09.872880 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77w9v"] Oct 03 10:07:10 crc kubenswrapper[4990]: I1003 10:07:10.880138 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" path="/var/lib/kubelet/pods/3e39c15d-a611-4e43-8f61-0fb4f820cb39/volumes" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.414097 4990 scope.go:117] "RemoveContainer" containerID="0f36f93a58a4abcefc2399a68eec1ce5537116c70acf225f4a2cad1ad503ac9d" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.439495 4990 scope.go:117] "RemoveContainer" 
containerID="18687cd53680abd930c5680f7523fd3629930d839ab92a472f3af82c58a7c9b4" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.472690 4990 scope.go:117] "RemoveContainer" containerID="1633e1faa9b6a2e02ae1af8387687c60cead6ef65411004b7d61911b8219235d" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.489744 4990 scope.go:117] "RemoveContainer" containerID="a97875ccc65a4d94561b8c329188369807b3b4c23aac3230cdb73d4704ca67de" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.517343 4990 scope.go:117] "RemoveContainer" containerID="d5ee1abf7a163799410b1511c4a9e8f15feba5f3acad686e5c5172038d090f40" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.564933 4990 scope.go:117] "RemoveContainer" containerID="fcf62b1283b33d38aeee0f2d91a2e4ca85cd88ae3945b5041a9e1aea0b6d1774" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.582882 4990 scope.go:117] "RemoveContainer" containerID="605055cfc80d36f87f85bf950602b2ae0f9b1559ba2ee7e06b749c78a8b9e130" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.609474 4990 scope.go:117] "RemoveContainer" containerID="f7eaf253d3aa5512f25c1f9d5ffce15dc33c5cf1d6cff9139aebbef8d767c346" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.638662 4990 scope.go:117] "RemoveContainer" containerID="21d85a37e8f6cc7bb88e6ce18cde1192ecd28961bc25f414c7f82da286e51f23" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.666443 4990 scope.go:117] "RemoveContainer" containerID="0e8e95a6d709c8f44add9d81ff967c2d33db6287fdfd87854e697b2031a80bfe" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.684083 4990 scope.go:117] "RemoveContainer" containerID="5218c4ea0a9ac0c5d479f39fef59faf1bab332d65ddb4788247bc0adbb6de898" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.703842 4990 scope.go:117] "RemoveContainer" containerID="1f910d8c95b928da975f02b11022b10642c780d91d0a9a470d6c5f0f5abd6093" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.728314 4990 scope.go:117] "RemoveContainer" 
containerID="fd9e74c91e6a08463958110965cb2cda9b429bda7c5d992f466ea00d8861ed7b" Oct 03 10:07:44 crc kubenswrapper[4990]: I1003 10:07:44.765009 4990 scope.go:117] "RemoveContainer" containerID="1ca4954d0191dc08f83434b422e932c5bfb8ed7e5628dd4ee7b78dfa04d4c8c4" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.772360 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:15 crc kubenswrapper[4990]: E1003 10:08:15.773024 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="extract-content" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.773036 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="extract-content" Oct 03 10:08:15 crc kubenswrapper[4990]: E1003 10:08:15.773047 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="extract-utilities" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.773053 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="extract-utilities" Oct 03 10:08:15 crc kubenswrapper[4990]: E1003 10:08:15.773071 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="registry-server" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.773078 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="registry-server" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.773229 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e39c15d-a611-4e43-8f61-0fb4f820cb39" containerName="registry-server" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.774200 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.785564 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.867096 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.867194 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7pr6\" (UniqueName: \"kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.867240 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.986096 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7pr6\" (UniqueName: \"kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.986201 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.986262 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.986828 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:15 crc kubenswrapper[4990]: I1003 10:08:15.986966 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:16 crc kubenswrapper[4990]: I1003 10:08:16.014415 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7pr6\" (UniqueName: \"kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6\") pod \"redhat-marketplace-d2p7z\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:16 crc kubenswrapper[4990]: I1003 10:08:16.098484 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:16 crc kubenswrapper[4990]: I1003 10:08:16.530180 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:17 crc kubenswrapper[4990]: I1003 10:08:17.486686 4990 generic.go:334] "Generic (PLEG): container finished" podID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerID="8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149" exitCode=0 Oct 03 10:08:17 crc kubenswrapper[4990]: I1003 10:08:17.486764 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerDied","Data":"8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149"} Oct 03 10:08:17 crc kubenswrapper[4990]: I1003 10:08:17.486987 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerStarted","Data":"ce87fd1884ace2b4b311b9645a98be979f61937c1350caeac1f694a3f8c23e38"} Oct 03 10:08:18 crc kubenswrapper[4990]: I1003 10:08:18.496757 4990 generic.go:334] "Generic (PLEG): container finished" podID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerID="881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e" exitCode=0 Oct 03 10:08:18 crc kubenswrapper[4990]: I1003 10:08:18.496849 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerDied","Data":"881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e"} Oct 03 10:08:19 crc kubenswrapper[4990]: I1003 10:08:19.507284 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" 
event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerStarted","Data":"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f"} Oct 03 10:08:19 crc kubenswrapper[4990]: I1003 10:08:19.530996 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2p7z" podStartSLOduration=2.9994761949999997 podStartE2EDuration="4.530971649s" podCreationTimestamp="2025-10-03 10:08:15 +0000 UTC" firstStartedPulling="2025-10-03 10:08:17.491043172 +0000 UTC m=+1479.287675029" lastFinishedPulling="2025-10-03 10:08:19.022538616 +0000 UTC m=+1480.819170483" observedRunningTime="2025-10-03 10:08:19.525948662 +0000 UTC m=+1481.322580519" watchObservedRunningTime="2025-10-03 10:08:19.530971649 +0000 UTC m=+1481.327603506" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.754905 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.756893 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.767241 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.853457 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.853615 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4v6j\" (UniqueName: \"kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.853757 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.955454 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4v6j\" (UniqueName: \"kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.955543 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.955642 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.956155 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.956245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:20 crc kubenswrapper[4990]: I1003 10:08:20.981751 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4v6j\" (UniqueName: \"kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j\") pod \"community-operators-6zxxd\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:21 crc kubenswrapper[4990]: I1003 10:08:21.093820 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:21 crc kubenswrapper[4990]: I1003 10:08:21.570175 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:21 crc kubenswrapper[4990]: W1003 10:08:21.578208 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63831f90_b33b_4b8a_b923_d58700bf8d79.slice/crio-3027f912ce2ba81daa2611342e9a1ef84b46e28d2a6b3849d098f2ddee3764b1 WatchSource:0}: Error finding container 3027f912ce2ba81daa2611342e9a1ef84b46e28d2a6b3849d098f2ddee3764b1: Status 404 returned error can't find the container with id 3027f912ce2ba81daa2611342e9a1ef84b46e28d2a6b3849d098f2ddee3764b1 Oct 03 10:08:22 crc kubenswrapper[4990]: I1003 10:08:22.544135 4990 generic.go:334] "Generic (PLEG): container finished" podID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerID="9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173" exitCode=0 Oct 03 10:08:22 crc kubenswrapper[4990]: I1003 10:08:22.544215 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerDied","Data":"9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173"} Oct 03 10:08:22 crc kubenswrapper[4990]: I1003 10:08:22.544439 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerStarted","Data":"3027f912ce2ba81daa2611342e9a1ef84b46e28d2a6b3849d098f2ddee3764b1"} Oct 03 10:08:24 crc kubenswrapper[4990]: I1003 10:08:24.571951 4990 generic.go:334] "Generic (PLEG): container finished" podID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerID="e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07" exitCode=0 Oct 03 10:08:24 crc kubenswrapper[4990]: I1003 
10:08:24.572045 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerDied","Data":"e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07"} Oct 03 10:08:25 crc kubenswrapper[4990]: I1003 10:08:25.303863 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:08:25 crc kubenswrapper[4990]: I1003 10:08:25.304476 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:08:25 crc kubenswrapper[4990]: I1003 10:08:25.581380 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerStarted","Data":"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8"} Oct 03 10:08:25 crc kubenswrapper[4990]: I1003 10:08:25.605100 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6zxxd" podStartSLOduration=2.979862517 podStartE2EDuration="5.605074571s" podCreationTimestamp="2025-10-03 10:08:20 +0000 UTC" firstStartedPulling="2025-10-03 10:08:22.546422003 +0000 UTC m=+1484.343053860" lastFinishedPulling="2025-10-03 10:08:25.171634057 +0000 UTC m=+1486.968265914" observedRunningTime="2025-10-03 10:08:25.597861999 +0000 UTC m=+1487.394493846" watchObservedRunningTime="2025-10-03 10:08:25.605074571 +0000 UTC m=+1487.401706458" Oct 03 10:08:26 crc 
kubenswrapper[4990]: I1003 10:08:26.099560 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:26 crc kubenswrapper[4990]: I1003 10:08:26.099778 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:26 crc kubenswrapper[4990]: I1003 10:08:26.149703 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:26 crc kubenswrapper[4990]: I1003 10:08:26.635912 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:27 crc kubenswrapper[4990]: I1003 10:08:27.346074 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:28 crc kubenswrapper[4990]: I1003 10:08:28.605090 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2p7z" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="registry-server" containerID="cri-o://d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f" gracePeriod=2 Oct 03 10:08:28 crc kubenswrapper[4990]: I1003 10:08:28.987274 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.073576 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7pr6\" (UniqueName: \"kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6\") pod \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.074023 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities\") pod \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.074089 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content\") pod \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\" (UID: \"3fc41f4f-3c19-4f23-b344-ea20a0f646ca\") " Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.075629 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities" (OuterVolumeSpecName: "utilities") pod "3fc41f4f-3c19-4f23-b344-ea20a0f646ca" (UID: "3fc41f4f-3c19-4f23-b344-ea20a0f646ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.081749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6" (OuterVolumeSpecName: "kube-api-access-c7pr6") pod "3fc41f4f-3c19-4f23-b344-ea20a0f646ca" (UID: "3fc41f4f-3c19-4f23-b344-ea20a0f646ca"). InnerVolumeSpecName "kube-api-access-c7pr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.089151 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fc41f4f-3c19-4f23-b344-ea20a0f646ca" (UID: "3fc41f4f-3c19-4f23-b344-ea20a0f646ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.175262 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7pr6\" (UniqueName: \"kubernetes.io/projected/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-kube-api-access-c7pr6\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.175297 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.175309 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc41f4f-3c19-4f23-b344-ea20a0f646ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.615263 4990 generic.go:334] "Generic (PLEG): container finished" podID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerID="d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f" exitCode=0 Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.615315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerDied","Data":"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f"} Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.615328 4990 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2p7z" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.615344 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2p7z" event={"ID":"3fc41f4f-3c19-4f23-b344-ea20a0f646ca","Type":"ContainerDied","Data":"ce87fd1884ace2b4b311b9645a98be979f61937c1350caeac1f694a3f8c23e38"} Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.615365 4990 scope.go:117] "RemoveContainer" containerID="d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.633163 4990 scope.go:117] "RemoveContainer" containerID="881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.648169 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.653435 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2p7z"] Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.671571 4990 scope.go:117] "RemoveContainer" containerID="8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.697943 4990 scope.go:117] "RemoveContainer" containerID="d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f" Oct 03 10:08:29 crc kubenswrapper[4990]: E1003 10:08:29.698193 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f\": container with ID starting with d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f not found: ID does not exist" containerID="d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.698227 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f"} err="failed to get container status \"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f\": rpc error: code = NotFound desc = could not find container \"d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f\": container with ID starting with d5e9c74d033b182b98aa5a0f9537eb33cac02fc8fc8798dac0e4857697a4338f not found: ID does not exist" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.698246 4990 scope.go:117] "RemoveContainer" containerID="881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e" Oct 03 10:08:29 crc kubenswrapper[4990]: E1003 10:08:29.698757 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e\": container with ID starting with 881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e not found: ID does not exist" containerID="881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.698779 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e"} err="failed to get container status \"881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e\": rpc error: code = NotFound desc = could not find container \"881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e\": container with ID starting with 881f716742bbd86988e43ee3b9c46c0c42732f8beaa11d852ddfa91df334620e not found: ID does not exist" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.698801 4990 scope.go:117] "RemoveContainer" containerID="8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149" Oct 03 10:08:29 crc kubenswrapper[4990]: E1003 
10:08:29.699473 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149\": container with ID starting with 8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149 not found: ID does not exist" containerID="8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149" Oct 03 10:08:29 crc kubenswrapper[4990]: I1003 10:08:29.699566 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149"} err="failed to get container status \"8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149\": rpc error: code = NotFound desc = could not find container \"8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149\": container with ID starting with 8f20ce538c5b7a347c53f56e4ad8aeaba017cd0e07c91f90fdf137b1f20ad149 not found: ID does not exist" Oct 03 10:08:30 crc kubenswrapper[4990]: I1003 10:08:30.885717 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" path="/var/lib/kubelet/pods/3fc41f4f-3c19-4f23-b344-ea20a0f646ca/volumes" Oct 03 10:08:31 crc kubenswrapper[4990]: I1003 10:08:31.094570 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:31 crc kubenswrapper[4990]: I1003 10:08:31.094658 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:31 crc kubenswrapper[4990]: I1003 10:08:31.140491 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:31 crc kubenswrapper[4990]: I1003 10:08:31.678859 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:32 crc kubenswrapper[4990]: I1003 10:08:32.743109 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:33 crc kubenswrapper[4990]: I1003 10:08:33.654230 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6zxxd" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="registry-server" containerID="cri-o://90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8" gracePeriod=2 Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.032128 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.149762 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4v6j\" (UniqueName: \"kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j\") pod \"63831f90-b33b-4b8a-b923-d58700bf8d79\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.150033 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content\") pod \"63831f90-b33b-4b8a-b923-d58700bf8d79\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.150183 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities\") pod \"63831f90-b33b-4b8a-b923-d58700bf8d79\" (UID: \"63831f90-b33b-4b8a-b923-d58700bf8d79\") " Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.151116 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities" (OuterVolumeSpecName: "utilities") pod "63831f90-b33b-4b8a-b923-d58700bf8d79" (UID: "63831f90-b33b-4b8a-b923-d58700bf8d79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.156019 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j" (OuterVolumeSpecName: "kube-api-access-l4v6j") pod "63831f90-b33b-4b8a-b923-d58700bf8d79" (UID: "63831f90-b33b-4b8a-b923-d58700bf8d79"). InnerVolumeSpecName "kube-api-access-l4v6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.213875 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63831f90-b33b-4b8a-b923-d58700bf8d79" (UID: "63831f90-b33b-4b8a-b923-d58700bf8d79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.251210 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4v6j\" (UniqueName: \"kubernetes.io/projected/63831f90-b33b-4b8a-b923-d58700bf8d79-kube-api-access-l4v6j\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.251357 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.251417 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63831f90-b33b-4b8a-b923-d58700bf8d79-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.663988 4990 generic.go:334] "Generic (PLEG): container finished" podID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerID="90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8" exitCode=0 Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.664109 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zxxd" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.664095 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerDied","Data":"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8"} Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.664442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zxxd" event={"ID":"63831f90-b33b-4b8a-b923-d58700bf8d79","Type":"ContainerDied","Data":"3027f912ce2ba81daa2611342e9a1ef84b46e28d2a6b3849d098f2ddee3764b1"} Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.664477 4990 scope.go:117] "RemoveContainer" containerID="90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.697837 4990 scope.go:117] "RemoveContainer" containerID="e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.700026 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.705362 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6zxxd"] Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.727061 4990 scope.go:117] "RemoveContainer" containerID="9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.746041 4990 scope.go:117] "RemoveContainer" containerID="90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8" Oct 03 10:08:34 crc kubenswrapper[4990]: E1003 10:08:34.746683 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8\": container with ID starting with 90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8 not found: ID does not exist" containerID="90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.746794 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8"} err="failed to get container status \"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8\": rpc error: code = NotFound desc = could not find container \"90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8\": container with ID starting with 90c502f9ac9d0a5b1a6dd2ab394a85efe85987068cfad7197d70e0b039d47df8 not found: ID does not exist" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.746838 4990 scope.go:117] "RemoveContainer" containerID="e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07" Oct 03 10:08:34 crc kubenswrapper[4990]: E1003 10:08:34.747488 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07\": container with ID starting with e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07 not found: ID does not exist" containerID="e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.747633 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07"} err="failed to get container status \"e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07\": rpc error: code = NotFound desc = could not find container \"e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07\": container with ID 
starting with e494ec949188bd225b86036503273c8ecddeafc1b1bda33afb26472fa8983b07 not found: ID does not exist" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.747746 4990 scope.go:117] "RemoveContainer" containerID="9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173" Oct 03 10:08:34 crc kubenswrapper[4990]: E1003 10:08:34.748192 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173\": container with ID starting with 9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173 not found: ID does not exist" containerID="9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.748229 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173"} err="failed to get container status \"9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173\": rpc error: code = NotFound desc = could not find container \"9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173\": container with ID starting with 9e8ebf77565080d0c0dbb87fdb333333e082964e14de230c96b8bf14e00f2173 not found: ID does not exist" Oct 03 10:08:34 crc kubenswrapper[4990]: I1003 10:08:34.882214 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" path="/var/lib/kubelet/pods/63831f90-b33b-4b8a-b923-d58700bf8d79/volumes" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.022260 4990 scope.go:117] "RemoveContainer" containerID="f9db0b13dd512f4e09ae656d0b18ac85d17dbf8d25dbbc39405095596f9d32d0" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.056972 4990 scope.go:117] "RemoveContainer" containerID="5709db94fbf37881b7dcb88ca8ea55ff5d3a16e10fcac9dff43ec457d9265d4c" Oct 03 10:08:45 crc kubenswrapper[4990]: 
I1003 10:08:45.084326 4990 scope.go:117] "RemoveContainer" containerID="736cd5636cb503db1231834fc566fac0514837d857cdaa40203947278154656f" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.147227 4990 scope.go:117] "RemoveContainer" containerID="18376407c1f5e9a0a8cae570c29ff378a32d84fb0a1eae862251a332518d5ed9" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.186923 4990 scope.go:117] "RemoveContainer" containerID="ef11333bf94355cba01475b1c22a8097568481d7e330f677fc3d3c2369b3bdc4" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.222921 4990 scope.go:117] "RemoveContainer" containerID="035fb51c6a8dbf9baf89703a1824ed1de4a14267cb2db1bae664bb9a91aaa7e9" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.260127 4990 scope.go:117] "RemoveContainer" containerID="6bcce0bfbc646707b7148f2d86f412c6ad5b8f72cdfc3ed1fe2a9dd30a9f075f" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.275111 4990 scope.go:117] "RemoveContainer" containerID="235ff221ea4844e3ddd9a837dd1b1ff70199e09ac1b1d727c1c8b458b0e2c2b9" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.306745 4990 scope.go:117] "RemoveContainer" containerID="b36b62113bc8b0a5153619c9ce15a20111b4cef90bbb719726c4d653486f1bd1" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.324082 4990 scope.go:117] "RemoveContainer" containerID="f18ee0e55f5d16bc0f91cca6c1eec4df5e8474256ea5b13b1d63ebb6a812ece3" Oct 03 10:08:45 crc kubenswrapper[4990]: I1003 10:08:45.354929 4990 scope.go:117] "RemoveContainer" containerID="2cded7f28a6d70eec113a2c1f2cd4403da85f5f7ed00423c09b48119b4dd6575" Oct 03 10:08:55 crc kubenswrapper[4990]: I1003 10:08:55.303499 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:08:55 crc kubenswrapper[4990]: I1003 10:08:55.304061 4990 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:09:25 crc kubenswrapper[4990]: I1003 10:09:25.304629 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:09:25 crc kubenswrapper[4990]: I1003 10:09:25.305492 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:09:25 crc kubenswrapper[4990]: I1003 10:09:25.305598 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:09:25 crc kubenswrapper[4990]: I1003 10:09:25.306350 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:09:25 crc kubenswrapper[4990]: I1003 10:09:25.306531 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" 
containerName="machine-config-daemon" containerID="cri-o://7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" gracePeriod=600 Oct 03 10:09:25 crc kubenswrapper[4990]: E1003 10:09:25.428995 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:09:26 crc kubenswrapper[4990]: I1003 10:09:26.067929 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" exitCode=0 Oct 03 10:09:26 crc kubenswrapper[4990]: I1003 10:09:26.067983 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71"} Oct 03 10:09:26 crc kubenswrapper[4990]: I1003 10:09:26.068457 4990 scope.go:117] "RemoveContainer" containerID="d16d23839f71b10620c63e3b9b22fb6701868b850a588763048d7da4f3291db7" Oct 03 10:09:26 crc kubenswrapper[4990]: I1003 10:09:26.069090 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:09:26 crc kubenswrapper[4990]: E1003 10:09:26.069350 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:09:40 crc kubenswrapper[4990]: I1003 10:09:40.872129 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:09:40 crc kubenswrapper[4990]: E1003 10:09:40.872758 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.537393 4990 scope.go:117] "RemoveContainer" containerID="27ee6574dd58ed3cc5609e5e3cf295930d7487f08737578441411bd7326a416d" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.562853 4990 scope.go:117] "RemoveContainer" containerID="603eafaafed14cad283cbaa3059780f5a1da6d31cc82934c8ad3431436581baf" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.586276 4990 scope.go:117] "RemoveContainer" containerID="73e5d97a45dc912b3b85601291d4c5afec6117cd29fbe2b1bc76a7237b673460" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.628381 4990 scope.go:117] "RemoveContainer" containerID="ae63f4c738228664295d529f1f2ce00d483a4a4e2629816aaf711c6d25cf258f" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.645187 4990 scope.go:117] "RemoveContainer" containerID="4128adaddf77e7ec00c9f9e607647913aef983b6901b7e19ceeba0f2e8ed8bbd" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.668082 4990 scope.go:117] "RemoveContainer" containerID="770d83938a22e570b023d61ae7a6544022cebee19724127ed29717752e5252c0" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.710963 4990 scope.go:117] "RemoveContainer" 
containerID="164df144161995268ea85b29c9b4a158f835548115b6ddb54a76c9316445f8ad" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.731569 4990 scope.go:117] "RemoveContainer" containerID="adffc8be595f8d01d2798d1490842d5e7a8588b528aebf87a6717bb8e399160b" Oct 03 10:09:45 crc kubenswrapper[4990]: I1003 10:09:45.749525 4990 scope.go:117] "RemoveContainer" containerID="20d2d25ce9bcc245f400e40ad8c0300acef06aacb84e0cd069565068a37a1714" Oct 03 10:09:55 crc kubenswrapper[4990]: I1003 10:09:55.871499 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:09:55 crc kubenswrapper[4990]: E1003 10:09:55.872383 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:10:09 crc kubenswrapper[4990]: I1003 10:10:09.872376 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:10:09 crc kubenswrapper[4990]: E1003 10:10:09.873405 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:10:22 crc kubenswrapper[4990]: I1003 10:10:22.872228 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:10:22 crc 
kubenswrapper[4990]: E1003 10:10:22.872997 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:10:36 crc kubenswrapper[4990]: I1003 10:10:36.871824 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:10:36 crc kubenswrapper[4990]: E1003 10:10:36.872766 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:10:45 crc kubenswrapper[4990]: I1003 10:10:45.873542 4990 scope.go:117] "RemoveContainer" containerID="56593d82c59221b08ee85182fb62a7183bf2c14dede1dee8ab760ea79c4fb46a" Oct 03 10:10:45 crc kubenswrapper[4990]: I1003 10:10:45.940343 4990 scope.go:117] "RemoveContainer" containerID="72d2eb3b6894a8014c179dca4994554aac84f2a81598f222f9283646a4f6d6e6" Oct 03 10:10:47 crc kubenswrapper[4990]: I1003 10:10:47.872144 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:10:47 crc kubenswrapper[4990]: E1003 10:10:47.872704 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:11:01 crc kubenswrapper[4990]: I1003 10:11:01.872301 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:11:01 crc kubenswrapper[4990]: E1003 10:11:01.873085 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:11:13 crc kubenswrapper[4990]: I1003 10:11:13.873289 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:11:13 crc kubenswrapper[4990]: E1003 10:11:13.875948 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:11:24 crc kubenswrapper[4990]: I1003 10:11:24.873285 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:11:24 crc kubenswrapper[4990]: E1003 10:11:24.874085 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:11:37 crc kubenswrapper[4990]: I1003 10:11:37.871558 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:11:37 crc kubenswrapper[4990]: E1003 10:11:37.872376 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:11:46 crc kubenswrapper[4990]: I1003 10:11:46.013098 4990 scope.go:117] "RemoveContainer" containerID="dcbd5d737932527d27f958573242ee91326051d3e9490ecdb70837f5f700c9d3" Oct 03 10:11:51 crc kubenswrapper[4990]: I1003 10:11:51.872628 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:11:51 crc kubenswrapper[4990]: E1003 10:11:51.873402 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:12:04 crc kubenswrapper[4990]: I1003 10:12:04.872594 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:12:04 crc kubenswrapper[4990]: 
E1003 10:12:04.873543 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:12:17 crc kubenswrapper[4990]: I1003 10:12:17.892135 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:12:17 crc kubenswrapper[4990]: E1003 10:12:17.893603 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:12:31 crc kubenswrapper[4990]: I1003 10:12:31.871631 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:12:31 crc kubenswrapper[4990]: E1003 10:12:31.872374 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:12:43 crc kubenswrapper[4990]: I1003 10:12:43.871711 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:12:43 crc 
kubenswrapper[4990]: E1003 10:12:43.872419 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:12:46 crc kubenswrapper[4990]: I1003 10:12:46.099943 4990 scope.go:117] "RemoveContainer" containerID="00f4875ec8272b3551b58fd88f4ed38c7b63402186dec2ee6b42c19ede373391" Oct 03 10:12:46 crc kubenswrapper[4990]: I1003 10:12:46.127295 4990 scope.go:117] "RemoveContainer" containerID="0bc80b0179018a6f6162e2e984236ff0b86684b8b755febd2da28a80c9bd0466" Oct 03 10:12:54 crc kubenswrapper[4990]: I1003 10:12:54.872721 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:12:54 crc kubenswrapper[4990]: E1003 10:12:54.873786 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:13:05 crc kubenswrapper[4990]: I1003 10:13:05.872194 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:13:05 crc kubenswrapper[4990]: E1003 10:13:05.873062 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:13:19 crc kubenswrapper[4990]: I1003 10:13:19.871401 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:13:19 crc kubenswrapper[4990]: E1003 10:13:19.872141 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:13:31 crc kubenswrapper[4990]: I1003 10:13:31.871629 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:13:31 crc kubenswrapper[4990]: E1003 10:13:31.872332 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:13:44 crc kubenswrapper[4990]: I1003 10:13:44.872315 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:13:44 crc kubenswrapper[4990]: E1003 10:13:44.872970 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:13:57 crc kubenswrapper[4990]: I1003 10:13:57.871362 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:13:57 crc kubenswrapper[4990]: E1003 10:13:57.872928 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:14:10 crc kubenswrapper[4990]: I1003 10:14:10.872129 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:14:10 crc kubenswrapper[4990]: E1003 10:14:10.872903 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:14:24 crc kubenswrapper[4990]: I1003 10:14:24.873587 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:14:24 crc kubenswrapper[4990]: E1003 10:14:24.875344 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:14:35 crc kubenswrapper[4990]: I1003 10:14:35.872889 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:14:36 crc kubenswrapper[4990]: I1003 10:14:36.662982 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966"} Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.165874 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d"] Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.166895 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.166920 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.166946 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.166957 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.166987 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="extract-utilities" Oct 03 10:15:00 crc 
kubenswrapper[4990]: I1003 10:15:00.166998 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.167020 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.167028 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.167045 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.167053 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: E1003 10:15:00.167069 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.167079 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.167321 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="63831f90-b33b-4b8a-b923-d58700bf8d79" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.167342 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc41f4f-3c19-4f23-b344-ea20a0f646ca" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.168068 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.170734 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.170799 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.181641 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d"] Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.235166 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.235255 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.235331 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dbbx\" (UniqueName: \"kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.337629 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.337773 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.337839 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dbbx\" (UniqueName: \"kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.340533 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.344112 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.359047 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dbbx\" (UniqueName: \"kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx\") pod \"collect-profiles-29324775-95p6d\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.490022 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.740344 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d"] Oct 03 10:15:00 crc kubenswrapper[4990]: I1003 10:15:00.850277 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" event={"ID":"8b19cc0d-3846-4242-8991-fbe787cc9680","Type":"ContainerStarted","Data":"06dd9557295f28b7e8bfc0d95e692c3ed90a2569cc9477d3087e81f89ce0de89"} Oct 03 10:15:01 crc kubenswrapper[4990]: I1003 10:15:01.861958 4990 generic.go:334] "Generic (PLEG): container finished" podID="8b19cc0d-3846-4242-8991-fbe787cc9680" containerID="1d0967b9b6e7aaf644eba392f856bc81d0cb03bfb5c16e1b642676daaec05290" exitCode=0 Oct 03 10:15:01 crc kubenswrapper[4990]: I1003 10:15:01.862114 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" 
event={"ID":"8b19cc0d-3846-4242-8991-fbe787cc9680","Type":"ContainerDied","Data":"1d0967b9b6e7aaf644eba392f856bc81d0cb03bfb5c16e1b642676daaec05290"} Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.182400 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.279032 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume\") pod \"8b19cc0d-3846-4242-8991-fbe787cc9680\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.279120 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume\") pod \"8b19cc0d-3846-4242-8991-fbe787cc9680\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.279229 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dbbx\" (UniqueName: \"kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx\") pod \"8b19cc0d-3846-4242-8991-fbe787cc9680\" (UID: \"8b19cc0d-3846-4242-8991-fbe787cc9680\") " Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.279972 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume" (OuterVolumeSpecName: "config-volume") pod "8b19cc0d-3846-4242-8991-fbe787cc9680" (UID: "8b19cc0d-3846-4242-8991-fbe787cc9680"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.286009 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx" (OuterVolumeSpecName: "kube-api-access-8dbbx") pod "8b19cc0d-3846-4242-8991-fbe787cc9680" (UID: "8b19cc0d-3846-4242-8991-fbe787cc9680"). InnerVolumeSpecName "kube-api-access-8dbbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.286494 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8b19cc0d-3846-4242-8991-fbe787cc9680" (UID: "8b19cc0d-3846-4242-8991-fbe787cc9680"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.380782 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8b19cc0d-3846-4242-8991-fbe787cc9680-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.380835 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8b19cc0d-3846-4242-8991-fbe787cc9680-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.380857 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dbbx\" (UniqueName: \"kubernetes.io/projected/8b19cc0d-3846-4242-8991-fbe787cc9680-kube-api-access-8dbbx\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.885033 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" 
event={"ID":"8b19cc0d-3846-4242-8991-fbe787cc9680","Type":"ContainerDied","Data":"06dd9557295f28b7e8bfc0d95e692c3ed90a2569cc9477d3087e81f89ce0de89"} Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.885081 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06dd9557295f28b7e8bfc0d95e692c3ed90a2569cc9477d3087e81f89ce0de89" Oct 03 10:15:03 crc kubenswrapper[4990]: I1003 10:15:03.885106 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d" Oct 03 10:16:55 crc kubenswrapper[4990]: I1003 10:16:55.303651 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:16:55 crc kubenswrapper[4990]: I1003 10:16:55.304487 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.083167 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:16:58 crc kubenswrapper[4990]: E1003 10:16:58.083564 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b19cc0d-3846-4242-8991-fbe787cc9680" containerName="collect-profiles" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.083582 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b19cc0d-3846-4242-8991-fbe787cc9680" containerName="collect-profiles" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.083788 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8b19cc0d-3846-4242-8991-fbe787cc9680" containerName="collect-profiles" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.085062 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.101502 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.189850 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfsbr\" (UniqueName: \"kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.189933 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.189961 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.291154 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfsbr\" (UniqueName: \"kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr\") pod 
\"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.291216 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.291235 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.291755 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.291834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.319710 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfsbr\" (UniqueName: \"kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr\") pod \"redhat-operators-g7pg9\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " 
pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.411986 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:16:58 crc kubenswrapper[4990]: I1003 10:16:58.862088 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:16:58 crc kubenswrapper[4990]: W1003 10:16:58.868606 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37dcf04a_496b_47de_885a_82eea8c4a9ec.slice/crio-5ac03ffc177d65594dabdc9739527d203d9659658205abdf33d76b596193e298 WatchSource:0}: Error finding container 5ac03ffc177d65594dabdc9739527d203d9659658205abdf33d76b596193e298: Status 404 returned error can't find the container with id 5ac03ffc177d65594dabdc9739527d203d9659658205abdf33d76b596193e298 Oct 03 10:16:59 crc kubenswrapper[4990]: I1003 10:16:59.865358 4990 generic.go:334] "Generic (PLEG): container finished" podID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerID="b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df" exitCode=0 Oct 03 10:16:59 crc kubenswrapper[4990]: I1003 10:16:59.865442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerDied","Data":"b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df"} Oct 03 10:16:59 crc kubenswrapper[4990]: I1003 10:16:59.865488 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerStarted","Data":"5ac03ffc177d65594dabdc9739527d203d9659658205abdf33d76b596193e298"} Oct 03 10:16:59 crc kubenswrapper[4990]: I1003 10:16:59.868235 4990 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 03 10:17:01 crc kubenswrapper[4990]: I1003 10:17:01.889486 4990 generic.go:334] "Generic (PLEG): container finished" podID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerID="a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb" exitCode=0 Oct 03 10:17:01 crc kubenswrapper[4990]: I1003 10:17:01.889692 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerDied","Data":"a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb"} Oct 03 10:17:02 crc kubenswrapper[4990]: I1003 10:17:02.900583 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerStarted","Data":"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0"} Oct 03 10:17:02 crc kubenswrapper[4990]: I1003 10:17:02.924078 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7pg9" podStartSLOduration=2.474651443 podStartE2EDuration="4.92405783s" podCreationTimestamp="2025-10-03 10:16:58 +0000 UTC" firstStartedPulling="2025-10-03 10:16:59.86795692 +0000 UTC m=+2001.664588777" lastFinishedPulling="2025-10-03 10:17:02.317363307 +0000 UTC m=+2004.113995164" observedRunningTime="2025-10-03 10:17:02.918912426 +0000 UTC m=+2004.715544303" watchObservedRunningTime="2025-10-03 10:17:02.92405783 +0000 UTC m=+2004.720689697" Oct 03 10:17:08 crc kubenswrapper[4990]: I1003 10:17:08.412476 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:08 crc kubenswrapper[4990]: I1003 10:17:08.412973 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:08 crc kubenswrapper[4990]: I1003 
10:17:08.461338 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:08 crc kubenswrapper[4990]: I1003 10:17:08.987821 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:09 crc kubenswrapper[4990]: I1003 10:17:09.034026 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:17:10 crc kubenswrapper[4990]: I1003 10:17:10.953966 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7pg9" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="registry-server" containerID="cri-o://a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0" gracePeriod=2 Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.385827 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.407893 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities\") pod \"37dcf04a-496b-47de-885a-82eea8c4a9ec\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.408041 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfsbr\" (UniqueName: \"kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr\") pod \"37dcf04a-496b-47de-885a-82eea8c4a9ec\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.408804 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content\") pod \"37dcf04a-496b-47de-885a-82eea8c4a9ec\" (UID: \"37dcf04a-496b-47de-885a-82eea8c4a9ec\") " Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.410218 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities" (OuterVolumeSpecName: "utilities") pod "37dcf04a-496b-47de-885a-82eea8c4a9ec" (UID: "37dcf04a-496b-47de-885a-82eea8c4a9ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.414284 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr" (OuterVolumeSpecName: "kube-api-access-bfsbr") pod "37dcf04a-496b-47de-885a-82eea8c4a9ec" (UID: "37dcf04a-496b-47de-885a-82eea8c4a9ec"). InnerVolumeSpecName "kube-api-access-bfsbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.511156 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.511195 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfsbr\" (UniqueName: \"kubernetes.io/projected/37dcf04a-496b-47de-885a-82eea8c4a9ec-kube-api-access-bfsbr\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.530179 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37dcf04a-496b-47de-885a-82eea8c4a9ec" (UID: "37dcf04a-496b-47de-885a-82eea8c4a9ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.612478 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37dcf04a-496b-47de-885a-82eea8c4a9ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.966640 4990 generic.go:334] "Generic (PLEG): container finished" podID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerID="a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0" exitCode=0 Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.966696 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerDied","Data":"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0"} Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.966725 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7pg9" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.966749 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7pg9" event={"ID":"37dcf04a-496b-47de-885a-82eea8c4a9ec","Type":"ContainerDied","Data":"5ac03ffc177d65594dabdc9739527d203d9659658205abdf33d76b596193e298"} Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.966778 4990 scope.go:117] "RemoveContainer" containerID="a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0" Oct 03 10:17:11 crc kubenswrapper[4990]: I1003 10:17:11.987646 4990 scope.go:117] "RemoveContainer" containerID="a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.021472 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.028751 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7pg9"] Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.033564 4990 scope.go:117] "RemoveContainer" containerID="b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.055140 4990 scope.go:117] "RemoveContainer" containerID="a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0" Oct 03 10:17:12 crc kubenswrapper[4990]: E1003 10:17:12.055587 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0\": container with ID starting with a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0 not found: ID does not exist" containerID="a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.055630 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0"} err="failed to get container status \"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0\": rpc error: code = NotFound desc = could not find container \"a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0\": container with ID starting with a7202a5271d64d311d9a55704822eaf9e17713f295e7d4d29627745f74fb31e0 not found: ID does not exist" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.055655 4990 scope.go:117] "RemoveContainer" containerID="a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb" Oct 03 10:17:12 crc kubenswrapper[4990]: E1003 10:17:12.056194 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb\": container with ID starting with a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb not found: ID does not exist" containerID="a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.056236 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb"} err="failed to get container status \"a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb\": rpc error: code = NotFound desc = could not find container \"a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb\": container with ID starting with a2fb91580a4e8580446156b0bcd3c2cc39c259b4e45ef2b23190f4f62904ecfb not found: ID does not exist" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.056264 4990 scope.go:117] "RemoveContainer" containerID="b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df" Oct 03 10:17:12 crc kubenswrapper[4990]: E1003 
10:17:12.056614 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df\": container with ID starting with b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df not found: ID does not exist" containerID="b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.056642 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df"} err="failed to get container status \"b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df\": rpc error: code = NotFound desc = could not find container \"b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df\": container with ID starting with b1a9d7a0063e6e835a255f6286ab83d2b6cab02f01d41645c7d9cc314ff404df not found: ID does not exist" Oct 03 10:17:12 crc kubenswrapper[4990]: I1003 10:17:12.879779 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" path="/var/lib/kubelet/pods/37dcf04a-496b-47de-885a-82eea8c4a9ec/volumes" Oct 03 10:17:25 crc kubenswrapper[4990]: I1003 10:17:25.304188 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:17:25 crc kubenswrapper[4990]: I1003 10:17:25.305028 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.476389 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:17:53 crc kubenswrapper[4990]: E1003 10:17:53.477228 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="extract-content" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.477243 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="extract-content" Oct 03 10:17:53 crc kubenswrapper[4990]: E1003 10:17:53.477255 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="registry-server" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.477261 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="registry-server" Oct 03 10:17:53 crc kubenswrapper[4990]: E1003 10:17:53.477275 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="extract-utilities" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.477282 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="extract-utilities" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.477420 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="37dcf04a-496b-47de-885a-82eea8c4a9ec" containerName="registry-server" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.478430 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.495740 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.560269 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4c9q\" (UniqueName: \"kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.560353 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.560455 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.661303 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.661436 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.661497 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4c9q\" (UniqueName: \"kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.661971 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.661985 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.685932 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4c9q\" (UniqueName: \"kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q\") pod \"certified-operators-m7n9m\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:53 crc kubenswrapper[4990]: I1003 10:17:53.798858 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:17:54 crc kubenswrapper[4990]: I1003 10:17:54.113029 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:17:54 crc kubenswrapper[4990]: I1003 10:17:54.325869 4990 generic.go:334] "Generic (PLEG): container finished" podID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerID="d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4" exitCode=0 Oct 03 10:17:54 crc kubenswrapper[4990]: I1003 10:17:54.325914 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerDied","Data":"d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4"} Oct 03 10:17:54 crc kubenswrapper[4990]: I1003 10:17:54.325942 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerStarted","Data":"caf980636072f361875bb42d2d126417e6723527a7f49bb35c5098c035534231"} Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.304056 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.304338 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.304383 4990 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.304899 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.304956 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966" gracePeriod=600 Oct 03 10:17:55 crc kubenswrapper[4990]: I1003 10:17:55.334558 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerStarted","Data":"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec"} Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.346106 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966" exitCode=0 Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.346185 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966"} Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.346628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec"} Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.346651 4990 scope.go:117] "RemoveContainer" containerID="7f3aae7502069a68e04b6e14cb6af71fa0576e0f9fc60f290433607d69393f71" Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.350529 4990 generic.go:334] "Generic (PLEG): container finished" podID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerID="c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec" exitCode=0 Oct 03 10:17:56 crc kubenswrapper[4990]: I1003 10:17:56.350580 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerDied","Data":"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec"} Oct 03 10:17:57 crc kubenswrapper[4990]: I1003 10:17:57.362501 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerStarted","Data":"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d"} Oct 03 10:17:57 crc kubenswrapper[4990]: I1003 10:17:57.383405 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m7n9m" podStartSLOduration=1.92718764 podStartE2EDuration="4.383387775s" podCreationTimestamp="2025-10-03 10:17:53 +0000 UTC" firstStartedPulling="2025-10-03 10:17:54.32728387 +0000 UTC m=+2056.123915727" lastFinishedPulling="2025-10-03 10:17:56.783484005 +0000 UTC m=+2058.580115862" observedRunningTime="2025-10-03 10:17:57.38165245 +0000 UTC m=+2059.178284327" watchObservedRunningTime="2025-10-03 10:17:57.383387775 +0000 UTC m=+2059.180019632" Oct 03 10:18:03 crc kubenswrapper[4990]: I1003 10:18:03.799116 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:03 crc kubenswrapper[4990]: I1003 10:18:03.799860 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:03 crc kubenswrapper[4990]: I1003 10:18:03.857459 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:04 crc kubenswrapper[4990]: I1003 10:18:04.464277 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:04 crc kubenswrapper[4990]: I1003 10:18:04.517388 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.428684 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m7n9m" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="registry-server" containerID="cri-o://847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d" gracePeriod=2 Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.873886 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.977197 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities\") pod \"276e9a5a-c2de-45f1-aedf-7ce13df70217\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.977248 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content\") pod \"276e9a5a-c2de-45f1-aedf-7ce13df70217\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.977319 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4c9q\" (UniqueName: \"kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q\") pod \"276e9a5a-c2de-45f1-aedf-7ce13df70217\" (UID: \"276e9a5a-c2de-45f1-aedf-7ce13df70217\") " Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.978172 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities" (OuterVolumeSpecName: "utilities") pod "276e9a5a-c2de-45f1-aedf-7ce13df70217" (UID: "276e9a5a-c2de-45f1-aedf-7ce13df70217"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:18:06 crc kubenswrapper[4990]: I1003 10:18:06.984911 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q" (OuterVolumeSpecName: "kube-api-access-r4c9q") pod "276e9a5a-c2de-45f1-aedf-7ce13df70217" (UID: "276e9a5a-c2de-45f1-aedf-7ce13df70217"). InnerVolumeSpecName "kube-api-access-r4c9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.021901 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "276e9a5a-c2de-45f1-aedf-7ce13df70217" (UID: "276e9a5a-c2de-45f1-aedf-7ce13df70217"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.078559 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4c9q\" (UniqueName: \"kubernetes.io/projected/276e9a5a-c2de-45f1-aedf-7ce13df70217-kube-api-access-r4c9q\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.078586 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.078595 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276e9a5a-c2de-45f1-aedf-7ce13df70217-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.437373 4990 generic.go:334] "Generic (PLEG): container finished" podID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerID="847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d" exitCode=0 Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.437428 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m7n9m" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.437417 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerDied","Data":"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d"} Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.437490 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m7n9m" event={"ID":"276e9a5a-c2de-45f1-aedf-7ce13df70217","Type":"ContainerDied","Data":"caf980636072f361875bb42d2d126417e6723527a7f49bb35c5098c035534231"} Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.437531 4990 scope.go:117] "RemoveContainer" containerID="847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.458231 4990 scope.go:117] "RemoveContainer" containerID="c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.474723 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.479473 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m7n9m"] Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.488800 4990 scope.go:117] "RemoveContainer" containerID="d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.532540 4990 scope.go:117] "RemoveContainer" containerID="847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d" Oct 03 10:18:07 crc kubenswrapper[4990]: E1003 10:18:07.533318 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d\": container with ID starting with 847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d not found: ID does not exist" containerID="847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.533368 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d"} err="failed to get container status \"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d\": rpc error: code = NotFound desc = could not find container \"847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d\": container with ID starting with 847ad4001c356c2e1bf07ddebe7090ac01b8d6096df68f4f4bcf144cc179fd2d not found: ID does not exist" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.533398 4990 scope.go:117] "RemoveContainer" containerID="c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec" Oct 03 10:18:07 crc kubenswrapper[4990]: E1003 10:18:07.533886 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec\": container with ID starting with c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec not found: ID does not exist" containerID="c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.533907 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec"} err="failed to get container status \"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec\": rpc error: code = NotFound desc = could not find container \"c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec\": container with ID 
starting with c8707fe744c3e7135e561e248c31cad3461bda411fd30edd6c65a0372785c1ec not found: ID does not exist" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.533925 4990 scope.go:117] "RemoveContainer" containerID="d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4" Oct 03 10:18:07 crc kubenswrapper[4990]: E1003 10:18:07.534325 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4\": container with ID starting with d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4 not found: ID does not exist" containerID="d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4" Oct 03 10:18:07 crc kubenswrapper[4990]: I1003 10:18:07.534346 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4"} err="failed to get container status \"d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4\": rpc error: code = NotFound desc = could not find container \"d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4\": container with ID starting with d3a3d25cbc44f67e33df9361cbf3e8fb773279f6cf8630c54b0e509eb85e47c4 not found: ID does not exist" Oct 03 10:18:08 crc kubenswrapper[4990]: I1003 10:18:08.880171 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" path="/var/lib/kubelet/pods/276e9a5a-c2de-45f1-aedf-7ce13df70217/volumes" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.323352 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:34 crc kubenswrapper[4990]: E1003 10:18:34.324833 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="registry-server" Oct 03 10:18:34 crc 
kubenswrapper[4990]: I1003 10:18:34.324871 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="registry-server" Oct 03 10:18:34 crc kubenswrapper[4990]: E1003 10:18:34.324909 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="extract-utilities" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.324931 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="extract-utilities" Oct 03 10:18:34 crc kubenswrapper[4990]: E1003 10:18:34.324973 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="extract-content" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.324992 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="extract-content" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.325410 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="276e9a5a-c2de-45f1-aedf-7ce13df70217" containerName="registry-server" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.328414 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.339808 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.482636 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.482709 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgbpz\" (UniqueName: \"kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.482800 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.584370 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.584429 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fgbpz\" (UniqueName: \"kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.584494 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.585059 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.585059 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.610381 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgbpz\" (UniqueName: \"kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz\") pod \"redhat-marketplace-pdt6v\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:34 crc kubenswrapper[4990]: I1003 10:18:34.693222 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:35 crc kubenswrapper[4990]: I1003 10:18:35.101951 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:35 crc kubenswrapper[4990]: W1003 10:18:35.114379 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6413d99_af6b_4dd5_9b0f_e27e3ce3c759.slice/crio-fc765ada6d8a7280a2ed05735a8919944597587c63c536ed749e993cdf841596 WatchSource:0}: Error finding container fc765ada6d8a7280a2ed05735a8919944597587c63c536ed749e993cdf841596: Status 404 returned error can't find the container with id fc765ada6d8a7280a2ed05735a8919944597587c63c536ed749e993cdf841596 Oct 03 10:18:35 crc kubenswrapper[4990]: I1003 10:18:35.688086 4990 generic.go:334] "Generic (PLEG): container finished" podID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerID="4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436" exitCode=0 Oct 03 10:18:35 crc kubenswrapper[4990]: I1003 10:18:35.688176 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerDied","Data":"4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436"} Oct 03 10:18:35 crc kubenswrapper[4990]: I1003 10:18:35.688230 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerStarted","Data":"fc765ada6d8a7280a2ed05735a8919944597587c63c536ed749e993cdf841596"} Oct 03 10:18:36 crc kubenswrapper[4990]: I1003 10:18:36.697877 4990 generic.go:334] "Generic (PLEG): container finished" podID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerID="fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea" exitCode=0 Oct 03 10:18:36 crc kubenswrapper[4990]: I1003 
10:18:36.698087 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerDied","Data":"fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea"} Oct 03 10:18:37 crc kubenswrapper[4990]: I1003 10:18:37.708091 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerStarted","Data":"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45"} Oct 03 10:18:37 crc kubenswrapper[4990]: I1003 10:18:37.726630 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pdt6v" podStartSLOduration=2.285728022 podStartE2EDuration="3.726612938s" podCreationTimestamp="2025-10-03 10:18:34 +0000 UTC" firstStartedPulling="2025-10-03 10:18:35.690178173 +0000 UTC m=+2097.486810040" lastFinishedPulling="2025-10-03 10:18:37.131063099 +0000 UTC m=+2098.927694956" observedRunningTime="2025-10-03 10:18:37.725312494 +0000 UTC m=+2099.521944401" watchObservedRunningTime="2025-10-03 10:18:37.726612938 +0000 UTC m=+2099.523244795" Oct 03 10:18:44 crc kubenswrapper[4990]: I1003 10:18:44.694065 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:44 crc kubenswrapper[4990]: I1003 10:18:44.694586 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:44 crc kubenswrapper[4990]: I1003 10:18:44.771402 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:44 crc kubenswrapper[4990]: I1003 10:18:44.817559 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 
10:18:45 crc kubenswrapper[4990]: I1003 10:18:45.014178 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:46 crc kubenswrapper[4990]: I1003 10:18:46.773935 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pdt6v" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="registry-server" containerID="cri-o://ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45" gracePeriod=2 Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.155883 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.180157 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities\") pod \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.181421 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities" (OuterVolumeSpecName: "utilities") pod "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" (UID: "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.281616 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgbpz\" (UniqueName: \"kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz\") pod \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.281680 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content\") pod \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\" (UID: \"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759\") " Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.282043 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.289996 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz" (OuterVolumeSpecName: "kube-api-access-fgbpz") pod "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" (UID: "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759"). InnerVolumeSpecName "kube-api-access-fgbpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.294609 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" (UID: "e6413d99-af6b-4dd5-9b0f-e27e3ce3c759"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.382813 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgbpz\" (UniqueName: \"kubernetes.io/projected/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-kube-api-access-fgbpz\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.382849 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.782207 4990 generic.go:334] "Generic (PLEG): container finished" podID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerID="ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45" exitCode=0 Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.782258 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerDied","Data":"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45"} Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.782288 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdt6v" event={"ID":"e6413d99-af6b-4dd5-9b0f-e27e3ce3c759","Type":"ContainerDied","Data":"fc765ada6d8a7280a2ed05735a8919944597587c63c536ed749e993cdf841596"} Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.782309 4990 scope.go:117] "RemoveContainer" containerID="ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.782431 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdt6v" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.800207 4990 scope.go:117] "RemoveContainer" containerID="fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.816704 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.822003 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdt6v"] Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.834732 4990 scope.go:117] "RemoveContainer" containerID="4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.848962 4990 scope.go:117] "RemoveContainer" containerID="ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45" Oct 03 10:18:47 crc kubenswrapper[4990]: E1003 10:18:47.849472 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45\": container with ID starting with ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45 not found: ID does not exist" containerID="ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.849528 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45"} err="failed to get container status \"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45\": rpc error: code = NotFound desc = could not find container \"ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45\": container with ID starting with ea4168798630a9bd7f10079059d928e90979a9c335e14400d6716ee9362b7b45 not found: 
ID does not exist" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.849557 4990 scope.go:117] "RemoveContainer" containerID="fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea" Oct 03 10:18:47 crc kubenswrapper[4990]: E1003 10:18:47.849922 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea\": container with ID starting with fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea not found: ID does not exist" containerID="fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.849967 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea"} err="failed to get container status \"fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea\": rpc error: code = NotFound desc = could not find container \"fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea\": container with ID starting with fab8e899b9fc9d97f8e596aba3528a13c5c72038c86971dea7ab80e10df5d9ea not found: ID does not exist" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.849996 4990 scope.go:117] "RemoveContainer" containerID="4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436" Oct 03 10:18:47 crc kubenswrapper[4990]: E1003 10:18:47.850265 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436\": container with ID starting with 4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436 not found: ID does not exist" containerID="4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436" Oct 03 10:18:47 crc kubenswrapper[4990]: I1003 10:18:47.850320 4990 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436"} err="failed to get container status \"4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436\": rpc error: code = NotFound desc = could not find container \"4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436\": container with ID starting with 4f21c65ee0d9e983252391685c42c8a3120b5f6306ffe7ce6934872e9300c436 not found: ID does not exist" Oct 03 10:18:48 crc kubenswrapper[4990]: I1003 10:18:48.881829 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" path="/var/lib/kubelet/pods/e6413d99-af6b-4dd5-9b0f-e27e3ce3c759/volumes" Oct 03 10:19:55 crc kubenswrapper[4990]: I1003 10:19:55.304118 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:19:55 crc kubenswrapper[4990]: I1003 10:19:55.304851 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:20:25 crc kubenswrapper[4990]: I1003 10:20:25.304553 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:20:25 crc kubenswrapper[4990]: I1003 10:20:25.305132 4990 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.304283 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.304965 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.305026 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.305899 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.305971 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" 
containerID="cri-o://7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" gracePeriod=600 Oct 03 10:20:55 crc kubenswrapper[4990]: E1003 10:20:55.430455 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.838476 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" exitCode=0 Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.838555 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec"} Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.838602 4990 scope.go:117] "RemoveContainer" containerID="b2acf6458a6d39b1d71bee331b01be7fecf262830a0ef956a66f563795081966" Oct 03 10:20:55 crc kubenswrapper[4990]: I1003 10:20:55.838987 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:20:55 crc kubenswrapper[4990]: E1003 10:20:55.839407 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:21:08 crc kubenswrapper[4990]: I1003 10:21:08.893383 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:21:08 crc kubenswrapper[4990]: E1003 10:21:08.894927 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:21:23 crc kubenswrapper[4990]: I1003 10:21:23.871830 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:21:23 crc kubenswrapper[4990]: E1003 10:21:23.872603 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.552640 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:32 crc kubenswrapper[4990]: E1003 10:21:32.553800 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="registry-server" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.553822 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="registry-server" Oct 03 10:21:32 crc kubenswrapper[4990]: 
E1003 10:21:32.553845 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="extract-content" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.553855 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="extract-content" Oct 03 10:21:32 crc kubenswrapper[4990]: E1003 10:21:32.553866 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="extract-utilities" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.553877 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="extract-utilities" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.554135 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6413d99-af6b-4dd5-9b0f-e27e3ce3c759" containerName="registry-server" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.555804 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.567267 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.671382 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.671474 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.671594 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49kv6\" (UniqueName: \"kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.773126 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.773204 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.773233 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49kv6\" (UniqueName: \"kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.774060 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.774117 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.794290 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49kv6\" (UniqueName: \"kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6\") pod \"community-operators-qg6mz\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:32 crc kubenswrapper[4990]: I1003 10:21:32.873627 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:33 crc kubenswrapper[4990]: I1003 10:21:33.170445 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:34 crc kubenswrapper[4990]: I1003 10:21:34.156812 4990 generic.go:334] "Generic (PLEG): container finished" podID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerID="4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5" exitCode=0 Oct 03 10:21:34 crc kubenswrapper[4990]: I1003 10:21:34.156923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerDied","Data":"4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5"} Oct 03 10:21:34 crc kubenswrapper[4990]: I1003 10:21:34.157126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerStarted","Data":"44b5d4ee6cdb96baa8b2972c119cb44efc79faf79726ed055cd6c1e35fb14c59"} Oct 03 10:21:36 crc kubenswrapper[4990]: I1003 10:21:36.174649 4990 generic.go:334] "Generic (PLEG): container finished" podID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerID="df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2" exitCode=0 Oct 03 10:21:36 crc kubenswrapper[4990]: I1003 10:21:36.174753 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerDied","Data":"df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2"} Oct 03 10:21:37 crc kubenswrapper[4990]: I1003 10:21:37.186472 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" 
event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerStarted","Data":"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b"} Oct 03 10:21:38 crc kubenswrapper[4990]: I1003 10:21:38.879051 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:21:38 crc kubenswrapper[4990]: E1003 10:21:38.879950 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:21:42 crc kubenswrapper[4990]: I1003 10:21:42.889670 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:42 crc kubenswrapper[4990]: I1003 10:21:42.890437 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:42 crc kubenswrapper[4990]: I1003 10:21:42.940954 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:42 crc kubenswrapper[4990]: I1003 10:21:42.965493 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qg6mz" podStartSLOduration=8.538701171 podStartE2EDuration="10.965471778s" podCreationTimestamp="2025-10-03 10:21:32 +0000 UTC" firstStartedPulling="2025-10-03 10:21:34.158904373 +0000 UTC m=+2275.955536230" lastFinishedPulling="2025-10-03 10:21:36.58567498 +0000 UTC m=+2278.382306837" observedRunningTime="2025-10-03 10:21:37.210123562 +0000 UTC m=+2279.006755449" watchObservedRunningTime="2025-10-03 10:21:42.965471778 
+0000 UTC m=+2284.762103665" Oct 03 10:21:43 crc kubenswrapper[4990]: I1003 10:21:43.284065 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:45 crc kubenswrapper[4990]: I1003 10:21:45.738784 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:45 crc kubenswrapper[4990]: I1003 10:21:45.739420 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qg6mz" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="registry-server" containerID="cri-o://e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b" gracePeriod=2 Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.192283 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.266783 4990 generic.go:334] "Generic (PLEG): container finished" podID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerID="e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b" exitCode=0 Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.266831 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerDied","Data":"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b"} Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.266866 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg6mz" event={"ID":"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd","Type":"ContainerDied","Data":"44b5d4ee6cdb96baa8b2972c119cb44efc79faf79726ed055cd6c1e35fb14c59"} Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.266875 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qg6mz" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.266901 4990 scope.go:117] "RemoveContainer" containerID="e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.287473 4990 scope.go:117] "RemoveContainer" containerID="df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.305531 4990 scope.go:117] "RemoveContainer" containerID="4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.327523 4990 scope.go:117] "RemoveContainer" containerID="e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b" Oct 03 10:21:46 crc kubenswrapper[4990]: E1003 10:21:46.328081 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b\": container with ID starting with e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b not found: ID does not exist" containerID="e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.328152 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b"} err="failed to get container status \"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b\": rpc error: code = NotFound desc = could not find container \"e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b\": container with ID starting with e73de4ea687cb56c2f4592bcfd8a9d6dac6c3073ed2c287f17db084d9a3b758b not found: ID does not exist" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.328181 4990 scope.go:117] "RemoveContainer" 
containerID="df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2" Oct 03 10:21:46 crc kubenswrapper[4990]: E1003 10:21:46.328438 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2\": container with ID starting with df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2 not found: ID does not exist" containerID="df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.328470 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2"} err="failed to get container status \"df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2\": rpc error: code = NotFound desc = could not find container \"df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2\": container with ID starting with df7eb4fc7e34a81d9c7ea647787e6de03f9a44cf8c8e599b53b1809c120dd9b2 not found: ID does not exist" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.328490 4990 scope.go:117] "RemoveContainer" containerID="4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5" Oct 03 10:21:46 crc kubenswrapper[4990]: E1003 10:21:46.328803 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5\": container with ID starting with 4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5 not found: ID does not exist" containerID="4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.328828 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5"} err="failed to get container status \"4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5\": rpc error: code = NotFound desc = could not find container \"4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5\": container with ID starting with 4c9f7a9c7e839d8a7f85bb2aa8fe09d8914718470e75652ad62b44487e5cdea5 not found: ID does not exist" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.382459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content\") pod \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.382706 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49kv6\" (UniqueName: \"kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6\") pod \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.382793 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities\") pod \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\" (UID: \"f0f50c80-cf8a-46b7-baef-ba407bc4e1fd\") " Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.383403 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities" (OuterVolumeSpecName: "utilities") pod "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" (UID: "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.388219 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6" (OuterVolumeSpecName: "kube-api-access-49kv6") pod "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" (UID: "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd"). InnerVolumeSpecName "kube-api-access-49kv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.429976 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" (UID: "f0f50c80-cf8a-46b7-baef-ba407bc4e1fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.484749 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49kv6\" (UniqueName: \"kubernetes.io/projected/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-kube-api-access-49kv6\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.484805 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.484825 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.593879 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 
10:21:46.599338 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qg6mz"] Oct 03 10:21:46 crc kubenswrapper[4990]: I1003 10:21:46.882272 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" path="/var/lib/kubelet/pods/f0f50c80-cf8a-46b7-baef-ba407bc4e1fd/volumes" Oct 03 10:21:52 crc kubenswrapper[4990]: I1003 10:21:52.872276 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:21:52 crc kubenswrapper[4990]: E1003 10:21:52.873054 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:22:03 crc kubenswrapper[4990]: I1003 10:22:03.871749 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:22:03 crc kubenswrapper[4990]: E1003 10:22:03.872475 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:22:17 crc kubenswrapper[4990]: I1003 10:22:17.871934 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:22:17 crc kubenswrapper[4990]: E1003 10:22:17.872778 4990 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:22:32 crc kubenswrapper[4990]: I1003 10:22:32.872248 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:22:32 crc kubenswrapper[4990]: E1003 10:22:32.873186 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:22:47 crc kubenswrapper[4990]: I1003 10:22:47.872220 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:22:47 crc kubenswrapper[4990]: E1003 10:22:47.872991 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:23:01 crc kubenswrapper[4990]: I1003 10:23:01.872701 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:23:01 crc kubenswrapper[4990]: E1003 10:23:01.873447 4990 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:23:12 crc kubenswrapper[4990]: I1003 10:23:12.871777 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:23:12 crc kubenswrapper[4990]: E1003 10:23:12.872442 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:23:25 crc kubenswrapper[4990]: I1003 10:23:25.871868 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:23:25 crc kubenswrapper[4990]: E1003 10:23:25.872633 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:23:37 crc kubenswrapper[4990]: I1003 10:23:37.871556 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:23:37 crc kubenswrapper[4990]: E1003 
10:23:37.873852 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:23:51 crc kubenswrapper[4990]: I1003 10:23:51.872492 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:23:51 crc kubenswrapper[4990]: E1003 10:23:51.873251 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:24:02 crc kubenswrapper[4990]: I1003 10:24:02.873146 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:24:02 crc kubenswrapper[4990]: E1003 10:24:02.874225 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:24:16 crc kubenswrapper[4990]: I1003 10:24:16.872039 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:24:16 crc 
kubenswrapper[4990]: E1003 10:24:16.872865 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:24:28 crc kubenswrapper[4990]: I1003 10:24:28.878554 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:24:28 crc kubenswrapper[4990]: E1003 10:24:28.880382 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:24:43 crc kubenswrapper[4990]: I1003 10:24:43.871686 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:24:43 crc kubenswrapper[4990]: E1003 10:24:43.872335 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:24:58 crc kubenswrapper[4990]: I1003 10:24:58.883283 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 
03 10:24:58 crc kubenswrapper[4990]: E1003 10:24:58.884369 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:25:12 crc kubenswrapper[4990]: I1003 10:25:12.872245 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:25:12 crc kubenswrapper[4990]: E1003 10:25:12.875631 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:25:23 crc kubenswrapper[4990]: I1003 10:25:23.871406 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:25:23 crc kubenswrapper[4990]: E1003 10:25:23.872293 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:25:35 crc kubenswrapper[4990]: I1003 10:25:35.871719 4990 scope.go:117] "RemoveContainer" 
containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:25:35 crc kubenswrapper[4990]: E1003 10:25:35.872843 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:25:48 crc kubenswrapper[4990]: I1003 10:25:48.881329 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:25:48 crc kubenswrapper[4990]: E1003 10:25:48.882785 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:26:00 crc kubenswrapper[4990]: I1003 10:26:00.872064 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:26:01 crc kubenswrapper[4990]: I1003 10:26:01.409860 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2"} Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.686483 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:04 crc kubenswrapper[4990]: E1003 10:28:04.687550 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="extract-utilities" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.687571 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="extract-utilities" Oct 03 10:28:04 crc kubenswrapper[4990]: E1003 10:28:04.687592 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="registry-server" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.687603 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="registry-server" Oct 03 10:28:04 crc kubenswrapper[4990]: E1003 10:28:04.687648 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="extract-content" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.687659 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="extract-content" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.687895 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f50c80-cf8a-46b7-baef-ba407bc4e1fd" containerName="registry-server" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.689445 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.694331 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.694446 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.694468 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blk84\" (UniqueName: \"kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.699682 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.795970 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.796016 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-blk84\" (UniqueName: \"kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.796059 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.796687 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.796990 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:04 crc kubenswrapper[4990]: I1003 10:28:04.817351 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blk84\" (UniqueName: \"kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84\") pod \"redhat-operators-pz247\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:05 crc kubenswrapper[4990]: I1003 10:28:05.077435 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:05 crc kubenswrapper[4990]: I1003 10:28:05.503080 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:06 crc kubenswrapper[4990]: I1003 10:28:06.470626 4990 generic.go:334] "Generic (PLEG): container finished" podID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerID="9b971d689404286d1efc8ce46a9fa36c41f8940651167895728a006b822efe90" exitCode=0 Oct 03 10:28:06 crc kubenswrapper[4990]: I1003 10:28:06.470764 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerDied","Data":"9b971d689404286d1efc8ce46a9fa36c41f8940651167895728a006b822efe90"} Oct 03 10:28:06 crc kubenswrapper[4990]: I1003 10:28:06.470872 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerStarted","Data":"f1dec80901ec15fd2865579d34c0d61f174483c222be55bc3065601fd1cab20d"} Oct 03 10:28:06 crc kubenswrapper[4990]: I1003 10:28:06.472477 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:28:07 crc kubenswrapper[4990]: I1003 10:28:07.478880 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerStarted","Data":"623a49aa22817e3e8826213d4af92b6a2364c379d4e880a62854c2068352de38"} Oct 03 10:28:08 crc kubenswrapper[4990]: I1003 10:28:08.489206 4990 generic.go:334] "Generic (PLEG): container finished" podID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerID="623a49aa22817e3e8826213d4af92b6a2364c379d4e880a62854c2068352de38" exitCode=0 Oct 03 10:28:08 crc kubenswrapper[4990]: I1003 10:28:08.489388 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerDied","Data":"623a49aa22817e3e8826213d4af92b6a2364c379d4e880a62854c2068352de38"} Oct 03 10:28:10 crc kubenswrapper[4990]: I1003 10:28:10.510503 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerStarted","Data":"d3cf00ed98a8436e23af640bea8b80314939eb9a1a9b5b97f2cedc34725c3af7"} Oct 03 10:28:10 crc kubenswrapper[4990]: I1003 10:28:10.539180 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pz247" podStartSLOduration=3.507599443 podStartE2EDuration="6.539161099s" podCreationTimestamp="2025-10-03 10:28:04 +0000 UTC" firstStartedPulling="2025-10-03 10:28:06.472238737 +0000 UTC m=+2668.268870604" lastFinishedPulling="2025-10-03 10:28:09.503800403 +0000 UTC m=+2671.300432260" observedRunningTime="2025-10-03 10:28:10.533953535 +0000 UTC m=+2672.330585392" watchObservedRunningTime="2025-10-03 10:28:10.539161099 +0000 UTC m=+2672.335792966" Oct 03 10:28:15 crc kubenswrapper[4990]: I1003 10:28:15.078497 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:15 crc kubenswrapper[4990]: I1003 10:28:15.078907 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:15 crc kubenswrapper[4990]: I1003 10:28:15.123662 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:15 crc kubenswrapper[4990]: I1003 10:28:15.598938 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:15 crc kubenswrapper[4990]: I1003 10:28:15.648615 4990 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:17 crc kubenswrapper[4990]: I1003 10:28:17.568344 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pz247" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="registry-server" containerID="cri-o://d3cf00ed98a8436e23af640bea8b80314939eb9a1a9b5b97f2cedc34725c3af7" gracePeriod=2 Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.585799 4990 generic.go:334] "Generic (PLEG): container finished" podID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerID="d3cf00ed98a8436e23af640bea8b80314939eb9a1a9b5b97f2cedc34725c3af7" exitCode=0 Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.585854 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerDied","Data":"d3cf00ed98a8436e23af640bea8b80314939eb9a1a9b5b97f2cedc34725c3af7"} Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.768126 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.842275 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blk84\" (UniqueName: \"kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84\") pod \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.843328 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content\") pod \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.843464 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities\") pod \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\" (UID: \"2c6ccc60-3147-46d7-8b97-40b3268df0d3\") " Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.844230 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities" (OuterVolumeSpecName: "utilities") pod "2c6ccc60-3147-46d7-8b97-40b3268df0d3" (UID: "2c6ccc60-3147-46d7-8b97-40b3268df0d3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.844542 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.850085 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84" (OuterVolumeSpecName: "kube-api-access-blk84") pod "2c6ccc60-3147-46d7-8b97-40b3268df0d3" (UID: "2c6ccc60-3147-46d7-8b97-40b3268df0d3"). InnerVolumeSpecName "kube-api-access-blk84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.945825 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blk84\" (UniqueName: \"kubernetes.io/projected/2c6ccc60-3147-46d7-8b97-40b3268df0d3-kube-api-access-blk84\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:19 crc kubenswrapper[4990]: I1003 10:28:19.948607 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c6ccc60-3147-46d7-8b97-40b3268df0d3" (UID: "2c6ccc60-3147-46d7-8b97-40b3268df0d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.047327 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c6ccc60-3147-46d7-8b97-40b3268df0d3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.595437 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pz247" event={"ID":"2c6ccc60-3147-46d7-8b97-40b3268df0d3","Type":"ContainerDied","Data":"f1dec80901ec15fd2865579d34c0d61f174483c222be55bc3065601fd1cab20d"} Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.595503 4990 scope.go:117] "RemoveContainer" containerID="d3cf00ed98a8436e23af640bea8b80314939eb9a1a9b5b97f2cedc34725c3af7" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.596395 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pz247" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.617060 4990 scope.go:117] "RemoveContainer" containerID="623a49aa22817e3e8826213d4af92b6a2364c379d4e880a62854c2068352de38" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.634651 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.639677 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pz247"] Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.654972 4990 scope.go:117] "RemoveContainer" containerID="9b971d689404286d1efc8ce46a9fa36c41f8940651167895728a006b822efe90" Oct 03 10:28:20 crc kubenswrapper[4990]: I1003 10:28:20.885034 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" path="/var/lib/kubelet/pods/2c6ccc60-3147-46d7-8b97-40b3268df0d3/volumes" Oct 03 10:28:25 crc 
kubenswrapper[4990]: I1003 10:28:25.304320 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:28:25 crc kubenswrapper[4990]: I1003 10:28:25.304710 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.166591 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:26 crc kubenswrapper[4990]: E1003 10:28:26.167471 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="registry-server" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.167490 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="registry-server" Oct 03 10:28:26 crc kubenswrapper[4990]: E1003 10:28:26.167499 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="extract-content" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.167509 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="extract-content" Oct 03 10:28:26 crc kubenswrapper[4990]: E1003 10:28:26.167639 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="extract-utilities" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.167646 4990 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="extract-utilities" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.167812 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6ccc60-3147-46d7-8b97-40b3268df0d3" containerName="registry-server" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.169153 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.199686 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.233266 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cdv\" (UniqueName: \"kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.233327 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.233608 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.334561 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.334675 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cdv\" (UniqueName: \"kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.334707 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.335291 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.335352 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.356574 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m4cdv\" (UniqueName: \"kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv\") pod \"certified-operators-c6pp4\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.506049 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:26 crc kubenswrapper[4990]: I1003 10:28:26.986598 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:27 crc kubenswrapper[4990]: I1003 10:28:27.661391 4990 generic.go:334] "Generic (PLEG): container finished" podID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerID="64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16" exitCode=0 Oct 03 10:28:27 crc kubenswrapper[4990]: I1003 10:28:27.661446 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerDied","Data":"64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16"} Oct 03 10:28:27 crc kubenswrapper[4990]: I1003 10:28:27.661727 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerStarted","Data":"541fb54d5d1c6a5797dac68daee55568071cf52b948d33a9022e50cbd63129d1"} Oct 03 10:28:28 crc kubenswrapper[4990]: I1003 10:28:28.673856 4990 generic.go:334] "Generic (PLEG): container finished" podID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerID="b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3" exitCode=0 Oct 03 10:28:28 crc kubenswrapper[4990]: I1003 10:28:28.673940 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" 
event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerDied","Data":"b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3"} Oct 03 10:28:29 crc kubenswrapper[4990]: I1003 10:28:29.688044 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerStarted","Data":"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21"} Oct 03 10:28:29 crc kubenswrapper[4990]: I1003 10:28:29.714959 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c6pp4" podStartSLOduration=2.207550522 podStartE2EDuration="3.714913199s" podCreationTimestamp="2025-10-03 10:28:26 +0000 UTC" firstStartedPulling="2025-10-03 10:28:27.662811035 +0000 UTC m=+2689.459442882" lastFinishedPulling="2025-10-03 10:28:29.170173692 +0000 UTC m=+2690.966805559" observedRunningTime="2025-10-03 10:28:29.708232956 +0000 UTC m=+2691.504864833" watchObservedRunningTime="2025-10-03 10:28:29.714913199 +0000 UTC m=+2691.511545076" Oct 03 10:28:36 crc kubenswrapper[4990]: I1003 10:28:36.506848 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:36 crc kubenswrapper[4990]: I1003 10:28:36.507349 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:36 crc kubenswrapper[4990]: I1003 10:28:36.563320 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:36 crc kubenswrapper[4990]: I1003 10:28:36.803297 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:36 crc kubenswrapper[4990]: I1003 10:28:36.852336 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:38 crc kubenswrapper[4990]: I1003 10:28:38.761472 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c6pp4" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="registry-server" containerID="cri-o://f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21" gracePeriod=2 Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.674609 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.770743 4990 generic.go:334] "Generic (PLEG): container finished" podID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerID="f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21" exitCode=0 Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.770787 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerDied","Data":"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21"} Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.770815 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c6pp4" event={"ID":"5aba1402-0978-42e8-a6ea-36d22bcee7e6","Type":"ContainerDied","Data":"541fb54d5d1c6a5797dac68daee55568071cf52b948d33a9022e50cbd63129d1"} Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.770832 4990 scope.go:117] "RemoveContainer" containerID="f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.770963 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c6pp4" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.787556 4990 scope.go:117] "RemoveContainer" containerID="b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.806198 4990 scope.go:117] "RemoveContainer" containerID="64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.834294 4990 scope.go:117] "RemoveContainer" containerID="f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21" Oct 03 10:28:39 crc kubenswrapper[4990]: E1003 10:28:39.834790 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21\": container with ID starting with f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21 not found: ID does not exist" containerID="f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.834838 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21"} err="failed to get container status \"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21\": rpc error: code = NotFound desc = could not find container \"f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21\": container with ID starting with f932cfb0fec07ca6eb653b632f4971d28dc35437bfa5339917f601af23dcca21 not found: ID does not exist" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.834868 4990 scope.go:117] "RemoveContainer" containerID="b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3" Oct 03 10:28:39 crc kubenswrapper[4990]: E1003 10:28:39.835392 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3\": container with ID starting with b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3 not found: ID does not exist" containerID="b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.835424 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3"} err="failed to get container status \"b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3\": rpc error: code = NotFound desc = could not find container \"b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3\": container with ID starting with b9ad2c206f015a4277597973a7f490408151b6eb7f7bcc54702dbf8bc1eec2a3 not found: ID does not exist" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.835449 4990 scope.go:117] "RemoveContainer" containerID="64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16" Oct 03 10:28:39 crc kubenswrapper[4990]: E1003 10:28:39.835817 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16\": container with ID starting with 64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16 not found: ID does not exist" containerID="64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.835842 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16"} err="failed to get container status \"64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16\": rpc error: code = NotFound desc = could not find container 
\"64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16\": container with ID starting with 64667776217fa5b713ff0dccfedac6f2f38177ea79d6cd152ff6c334a4ed6b16 not found: ID does not exist" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.854103 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities\") pod \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.854195 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4cdv\" (UniqueName: \"kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv\") pod \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.854358 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content\") pod \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\" (UID: \"5aba1402-0978-42e8-a6ea-36d22bcee7e6\") " Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.855391 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities" (OuterVolumeSpecName: "utilities") pod "5aba1402-0978-42e8-a6ea-36d22bcee7e6" (UID: "5aba1402-0978-42e8-a6ea-36d22bcee7e6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.861133 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv" (OuterVolumeSpecName: "kube-api-access-m4cdv") pod "5aba1402-0978-42e8-a6ea-36d22bcee7e6" (UID: "5aba1402-0978-42e8-a6ea-36d22bcee7e6"). InnerVolumeSpecName "kube-api-access-m4cdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.957397 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:39 crc kubenswrapper[4990]: I1003 10:28:39.957446 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4cdv\" (UniqueName: \"kubernetes.io/projected/5aba1402-0978-42e8-a6ea-36d22bcee7e6-kube-api-access-m4cdv\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:40 crc kubenswrapper[4990]: I1003 10:28:40.796137 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aba1402-0978-42e8-a6ea-36d22bcee7e6" (UID: "5aba1402-0978-42e8-a6ea-36d22bcee7e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:28:40 crc kubenswrapper[4990]: I1003 10:28:40.868480 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aba1402-0978-42e8-a6ea-36d22bcee7e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:41 crc kubenswrapper[4990]: I1003 10:28:41.008360 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:41 crc kubenswrapper[4990]: I1003 10:28:41.008420 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c6pp4"] Oct 03 10:28:42 crc kubenswrapper[4990]: I1003 10:28:42.881707 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" path="/var/lib/kubelet/pods/5aba1402-0978-42e8-a6ea-36d22bcee7e6/volumes" Oct 03 10:28:55 crc kubenswrapper[4990]: I1003 10:28:55.304471 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:28:55 crc kubenswrapper[4990]: I1003 10:28:55.305111 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:29:25 crc kubenswrapper[4990]: I1003 10:29:25.304413 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 03 10:29:25 crc kubenswrapper[4990]: I1003 10:29:25.305284 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:29:25 crc kubenswrapper[4990]: I1003 10:29:25.305350 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:29:25 crc kubenswrapper[4990]: I1003 10:29:25.306261 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:29:25 crc kubenswrapper[4990]: I1003 10:29:25.306366 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2" gracePeriod=600 Oct 03 10:29:26 crc kubenswrapper[4990]: I1003 10:29:26.191931 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2" exitCode=0 Oct 03 10:29:26 crc kubenswrapper[4990]: I1003 10:29:26.191991 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2"} Oct 03 10:29:26 crc kubenswrapper[4990]: I1003 10:29:26.192347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"} Oct 03 10:29:26 crc kubenswrapper[4990]: I1003 10:29:26.192389 4990 scope.go:117] "RemoveContainer" containerID="7bd4381aed38c74d200ac7ef8d38af7e4f4c69ad39b057cc8765caebb495aaec" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.148722 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6"] Oct 03 10:30:00 crc kubenswrapper[4990]: E1003 10:30:00.149609 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="extract-utilities" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.149625 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="extract-utilities" Oct 03 10:30:00 crc kubenswrapper[4990]: E1003 10:30:00.149657 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="registry-server" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.149666 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="registry-server" Oct 03 10:30:00 crc kubenswrapper[4990]: E1003 10:30:00.149688 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="extract-content" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.149695 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="extract-content" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.149845 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aba1402-0978-42e8-a6ea-36d22bcee7e6" containerName="registry-server" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.150302 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.153615 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.154403 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.162799 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6"] Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.302293 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.302655 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj8bz\" (UniqueName: \"kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 
crc kubenswrapper[4990]: I1003 10:30:00.302709 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.404788 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.404847 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj8bz\" (UniqueName: \"kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.404877 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.405753 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: 
\"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.412497 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.424243 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj8bz\" (UniqueName: \"kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz\") pod \"collect-profiles-29324790-hzhh6\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.476584 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:00 crc kubenswrapper[4990]: I1003 10:30:00.912459 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6"] Oct 03 10:30:01 crc kubenswrapper[4990]: I1003 10:30:01.498397 4990 generic.go:334] "Generic (PLEG): container finished" podID="a1d9d2a9-a20c-4589-9f8d-016e0c66141f" containerID="f3b15d0020cea5147559c9a3f562c1bf7cd08efba88ec32c78a3e331800e0efe" exitCode=0 Oct 03 10:30:01 crc kubenswrapper[4990]: I1003 10:30:01.498573 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" event={"ID":"a1d9d2a9-a20c-4589-9f8d-016e0c66141f","Type":"ContainerDied","Data":"f3b15d0020cea5147559c9a3f562c1bf7cd08efba88ec32c78a3e331800e0efe"} Oct 03 10:30:01 crc kubenswrapper[4990]: I1003 10:30:01.498708 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" event={"ID":"a1d9d2a9-a20c-4589-9f8d-016e0c66141f","Type":"ContainerStarted","Data":"0cefbf1505e86ad0b492fe960b04f8693461f96088b37b687c13b84b23582e9c"} Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.806214 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.934332 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume\") pod \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.934485 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj8bz\" (UniqueName: \"kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz\") pod \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.934572 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume\") pod \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\" (UID: \"a1d9d2a9-a20c-4589-9f8d-016e0c66141f\") " Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.935988 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1d9d2a9-a20c-4589-9f8d-016e0c66141f" (UID: "a1d9d2a9-a20c-4589-9f8d-016e0c66141f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.940029 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz" (OuterVolumeSpecName: "kube-api-access-dj8bz") pod "a1d9d2a9-a20c-4589-9f8d-016e0c66141f" (UID: "a1d9d2a9-a20c-4589-9f8d-016e0c66141f"). 
InnerVolumeSpecName "kube-api-access-dj8bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:30:02 crc kubenswrapper[4990]: I1003 10:30:02.941057 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1d9d2a9-a20c-4589-9f8d-016e0c66141f" (UID: "a1d9d2a9-a20c-4589-9f8d-016e0c66141f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.036159 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.036198 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj8bz\" (UniqueName: \"kubernetes.io/projected/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-kube-api-access-dj8bz\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.036211 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1d9d2a9-a20c-4589-9f8d-016e0c66141f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.514991 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" event={"ID":"a1d9d2a9-a20c-4589-9f8d-016e0c66141f","Type":"ContainerDied","Data":"0cefbf1505e86ad0b492fe960b04f8693461f96088b37b687c13b84b23582e9c"} Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.515025 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.515037 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cefbf1505e86ad0b492fe960b04f8693461f96088b37b687c13b84b23582e9c" Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.871066 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78"] Oct 03 10:30:03 crc kubenswrapper[4990]: I1003 10:30:03.875216 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-jtl78"] Oct 03 10:30:04 crc kubenswrapper[4990]: I1003 10:30:04.881673 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd5d267-6907-4d32-9620-dd6270e911f7" path="/var/lib/kubelet/pods/5bd5d267-6907-4d32-9620-dd6270e911f7/volumes" Oct 03 10:30:46 crc kubenswrapper[4990]: I1003 10:30:46.521334 4990 scope.go:117] "RemoveContainer" containerID="2c7968d26ad6d4e4669638e81bfd2a046923998428660605182d899d83fe292d" Oct 03 10:31:25 crc kubenswrapper[4990]: I1003 10:31:25.303941 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:31:25 crc kubenswrapper[4990]: I1003 10:31:25.304692 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:31:55 crc kubenswrapper[4990]: I1003 10:31:55.304276 4990 patch_prober.go:28] interesting 
pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:31:55 crc kubenswrapper[4990]: I1003 10:31:55.304950 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.304574 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.305313 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.305383 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.306283 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.306389 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" gracePeriod=600 Oct 03 10:32:25 crc kubenswrapper[4990]: E1003 10:32:25.437050 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.724122 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" exitCode=0 Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.724173 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"} Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.724210 4990 scope.go:117] "RemoveContainer" containerID="6428793fc9e9b192a4c52b9248cddfc26bee246289664fe7e8acbb51dfd2a0d2" Oct 03 10:32:25 crc kubenswrapper[4990]: I1003 10:32:25.724766 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:32:25 crc kubenswrapper[4990]: E1003 10:32:25.725015 4990 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.069721 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:29 crc kubenswrapper[4990]: E1003 10:32:29.070542 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d9d2a9-a20c-4589-9f8d-016e0c66141f" containerName="collect-profiles" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.070564 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d9d2a9-a20c-4589-9f8d-016e0c66141f" containerName="collect-profiles" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.070846 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d9d2a9-a20c-4589-9f8d-016e0c66141f" containerName="collect-profiles" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.072451 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.096041 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.203991 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skxlb\" (UniqueName: \"kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.204078 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.204114 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.305687 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skxlb\" (UniqueName: \"kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.305769 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.305809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.306293 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.306327 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.323230 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skxlb\" (UniqueName: \"kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb\") pod \"community-operators-666sh\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.414273 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:29 crc kubenswrapper[4990]: I1003 10:32:29.965030 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:30 crc kubenswrapper[4990]: I1003 10:32:30.776627 4990 generic.go:334] "Generic (PLEG): container finished" podID="28329718-1bc9-472e-bc10-d2f130107b65" containerID="31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae" exitCode=0 Oct 03 10:32:30 crc kubenswrapper[4990]: I1003 10:32:30.776790 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerDied","Data":"31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae"} Oct 03 10:32:30 crc kubenswrapper[4990]: I1003 10:32:30.777053 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerStarted","Data":"4574639f98b0d31010e156a0534a10e265260bc0a1c8528e60b0a1be1a9a8979"} Oct 03 10:32:31 crc kubenswrapper[4990]: I1003 10:32:31.792014 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerStarted","Data":"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714"} Oct 03 10:32:32 crc kubenswrapper[4990]: I1003 10:32:32.803631 4990 generic.go:334] "Generic (PLEG): container finished" podID="28329718-1bc9-472e-bc10-d2f130107b65" containerID="9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714" exitCode=0 Oct 03 10:32:32 crc kubenswrapper[4990]: I1003 10:32:32.803685 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" 
event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerDied","Data":"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714"} Oct 03 10:32:33 crc kubenswrapper[4990]: I1003 10:32:33.814220 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerStarted","Data":"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00"} Oct 03 10:32:33 crc kubenswrapper[4990]: I1003 10:32:33.831716 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-666sh" podStartSLOduration=2.20929342 podStartE2EDuration="4.831699747s" podCreationTimestamp="2025-10-03 10:32:29 +0000 UTC" firstStartedPulling="2025-10-03 10:32:30.779314252 +0000 UTC m=+2932.575946150" lastFinishedPulling="2025-10-03 10:32:33.40172059 +0000 UTC m=+2935.198352477" observedRunningTime="2025-10-03 10:32:33.829198529 +0000 UTC m=+2935.625830406" watchObservedRunningTime="2025-10-03 10:32:33.831699747 +0000 UTC m=+2935.628331604" Oct 03 10:32:37 crc kubenswrapper[4990]: I1003 10:32:37.872000 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:32:37 crc kubenswrapper[4990]: E1003 10:32:37.872920 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:32:39 crc kubenswrapper[4990]: I1003 10:32:39.415710 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:39 crc 
kubenswrapper[4990]: I1003 10:32:39.416157 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:39 crc kubenswrapper[4990]: I1003 10:32:39.481657 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:39 crc kubenswrapper[4990]: I1003 10:32:39.961341 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:40 crc kubenswrapper[4990]: I1003 10:32:40.030697 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:41 crc kubenswrapper[4990]: I1003 10:32:41.882192 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-666sh" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="registry-server" containerID="cri-o://66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00" gracePeriod=2 Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.330566 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.505818 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skxlb\" (UniqueName: \"kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb\") pod \"28329718-1bc9-472e-bc10-d2f130107b65\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.505931 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities\") pod \"28329718-1bc9-472e-bc10-d2f130107b65\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.506020 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content\") pod \"28329718-1bc9-472e-bc10-d2f130107b65\" (UID: \"28329718-1bc9-472e-bc10-d2f130107b65\") " Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.507740 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities" (OuterVolumeSpecName: "utilities") pod "28329718-1bc9-472e-bc10-d2f130107b65" (UID: "28329718-1bc9-472e-bc10-d2f130107b65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.513159 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb" (OuterVolumeSpecName: "kube-api-access-skxlb") pod "28329718-1bc9-472e-bc10-d2f130107b65" (UID: "28329718-1bc9-472e-bc10-d2f130107b65"). InnerVolumeSpecName "kube-api-access-skxlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.601921 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28329718-1bc9-472e-bc10-d2f130107b65" (UID: "28329718-1bc9-472e-bc10-d2f130107b65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.607789 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.607852 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28329718-1bc9-472e-bc10-d2f130107b65-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.607874 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skxlb\" (UniqueName: \"kubernetes.io/projected/28329718-1bc9-472e-bc10-d2f130107b65-kube-api-access-skxlb\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.893610 4990 generic.go:334] "Generic (PLEG): container finished" podID="28329718-1bc9-472e-bc10-d2f130107b65" containerID="66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00" exitCode=0 Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.893682 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerDied","Data":"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00"} Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.893727 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-666sh" event={"ID":"28329718-1bc9-472e-bc10-d2f130107b65","Type":"ContainerDied","Data":"4574639f98b0d31010e156a0534a10e265260bc0a1c8528e60b0a1be1a9a8979"} Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.893743 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-666sh" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.893764 4990 scope.go:117] "RemoveContainer" containerID="66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.936554 4990 scope.go:117] "RemoveContainer" containerID="9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.939848 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.955084 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-666sh"] Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.967722 4990 scope.go:117] "RemoveContainer" containerID="31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.995120 4990 scope.go:117] "RemoveContainer" containerID="66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00" Oct 03 10:32:42 crc kubenswrapper[4990]: E1003 10:32:42.995488 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00\": container with ID starting with 66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00 not found: ID does not exist" containerID="66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 
10:32:42.995585 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00"} err="failed to get container status \"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00\": rpc error: code = NotFound desc = could not find container \"66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00\": container with ID starting with 66e281007a128efd5aa1a942e5ce32ed0e7c1948beaa9afe284b34a7f4f6ec00 not found: ID does not exist" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.995628 4990 scope.go:117] "RemoveContainer" containerID="9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714" Oct 03 10:32:42 crc kubenswrapper[4990]: E1003 10:32:42.995875 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714\": container with ID starting with 9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714 not found: ID does not exist" containerID="9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.995903 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714"} err="failed to get container status \"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714\": rpc error: code = NotFound desc = could not find container \"9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714\": container with ID starting with 9876bdd30b179bcc9b305b8e63a0aa01f23e1412dd79ce083bb3099f60ef2714 not found: ID does not exist" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.995922 4990 scope.go:117] "RemoveContainer" containerID="31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae" Oct 03 10:32:42 crc 
kubenswrapper[4990]: E1003 10:32:42.996207 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae\": container with ID starting with 31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae not found: ID does not exist" containerID="31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae" Oct 03 10:32:42 crc kubenswrapper[4990]: I1003 10:32:42.996260 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae"} err="failed to get container status \"31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae\": rpc error: code = NotFound desc = could not find container \"31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae\": container with ID starting with 31e9c3cdb3f7cb3fd485b81d6de864c22174f6a67cab800ad9a928673dc32fae not found: ID does not exist" Oct 03 10:32:44 crc kubenswrapper[4990]: I1003 10:32:44.892536 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28329718-1bc9-472e-bc10-d2f130107b65" path="/var/lib/kubelet/pods/28329718-1bc9-472e-bc10-d2f130107b65/volumes" Oct 03 10:32:48 crc kubenswrapper[4990]: I1003 10:32:48.881239 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:32:48 crc kubenswrapper[4990]: E1003 10:32:48.881931 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:33:03 crc 
kubenswrapper[4990]: I1003 10:33:03.871816 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:33:03 crc kubenswrapper[4990]: E1003 10:33:03.872806 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:33:17 crc kubenswrapper[4990]: I1003 10:33:17.880296 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:33:17 crc kubenswrapper[4990]: E1003 10:33:17.881015 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:33:28 crc kubenswrapper[4990]: I1003 10:33:28.877499 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:33:28 crc kubenswrapper[4990]: E1003 10:33:28.878336 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 
03 10:33:42 crc kubenswrapper[4990]: I1003 10:33:42.872710 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:33:42 crc kubenswrapper[4990]: E1003 10:33:42.873780 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:33:53 crc kubenswrapper[4990]: I1003 10:33:53.872665 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:33:53 crc kubenswrapper[4990]: E1003 10:33:53.873853 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:34:04 crc kubenswrapper[4990]: I1003 10:34:04.871949 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:34:04 crc kubenswrapper[4990]: E1003 10:34:04.872765 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:34:16 crc kubenswrapper[4990]: I1003 10:34:16.872067 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:34:16 crc kubenswrapper[4990]: E1003 10:34:16.872777 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:34:31 crc kubenswrapper[4990]: I1003 10:34:31.871407 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:34:31 crc kubenswrapper[4990]: E1003 10:34:31.872112 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:34:43 crc kubenswrapper[4990]: I1003 10:34:43.873262 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:34:43 crc kubenswrapper[4990]: E1003 10:34:43.874260 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:34:55 crc kubenswrapper[4990]: I1003 10:34:55.871466 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:34:55 crc kubenswrapper[4990]: E1003 10:34:55.872418 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.098686 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:02 crc kubenswrapper[4990]: E1003 10:35:02.099817 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="registry-server"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.099838 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="registry-server"
Oct 03 10:35:02 crc kubenswrapper[4990]: E1003 10:35:02.099863 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="extract-utilities"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.099871 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="extract-utilities"
Oct 03 10:35:02 crc kubenswrapper[4990]: E1003 10:35:02.099905 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="extract-content"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.099916 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="extract-content"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.100099 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="28329718-1bc9-472e-bc10-d2f130107b65" containerName="registry-server"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.102015 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.114976 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.151844 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.151911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npfzr\" (UniqueName: \"kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.151941 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.253724 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.254050 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npfzr\" (UniqueName: \"kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.254090 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.254264 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.254415 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.279429 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npfzr\" (UniqueName: \"kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr\") pod \"redhat-marketplace-ddcqv\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") " pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.431177 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:02 crc kubenswrapper[4990]: I1003 10:35:02.817635 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:03 crc kubenswrapper[4990]: I1003 10:35:03.194058 4990 generic.go:334] "Generic (PLEG): container finished" podID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerID="0d18dfa869148e0397fd6b0c22c78def745531f353b0ad36e53b66d8d5faa3de" exitCode=0
Oct 03 10:35:03 crc kubenswrapper[4990]: I1003 10:35:03.194125 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerDied","Data":"0d18dfa869148e0397fd6b0c22c78def745531f353b0ad36e53b66d8d5faa3de"}
Oct 03 10:35:03 crc kubenswrapper[4990]: I1003 10:35:03.194445 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerStarted","Data":"4bb1f12338e87eae2512fa55923a8310306f85d5a8e5fafb359e6cceef6c9b35"}
Oct 03 10:35:03 crc kubenswrapper[4990]: I1003 10:35:03.195954 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 10:35:05 crc kubenswrapper[4990]: I1003 10:35:05.212754 4990 generic.go:334] "Generic (PLEG): container finished" podID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerID="da8f801db9bfc04790cea01b57ba42a99ce29ea51b270958bdfe0d8e00d8e105" exitCode=0
Oct 03 10:35:05 crc kubenswrapper[4990]: I1003 10:35:05.212808 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerDied","Data":"da8f801db9bfc04790cea01b57ba42a99ce29ea51b270958bdfe0d8e00d8e105"}
Oct 03 10:35:06 crc kubenswrapper[4990]: I1003 10:35:06.225801 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerStarted","Data":"3419837a0a706ccd6cdaba7477db40eccf5b971fb8e70213fa967bc651f0c088"}
Oct 03 10:35:06 crc kubenswrapper[4990]: I1003 10:35:06.246422 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ddcqv" podStartSLOduration=1.8435925549999999 podStartE2EDuration="4.246400084s" podCreationTimestamp="2025-10-03 10:35:02 +0000 UTC" firstStartedPulling="2025-10-03 10:35:03.195555065 +0000 UTC m=+3084.992186922" lastFinishedPulling="2025-10-03 10:35:05.598362554 +0000 UTC m=+3087.394994451" observedRunningTime="2025-10-03 10:35:06.243981141 +0000 UTC m=+3088.040613048" watchObservedRunningTime="2025-10-03 10:35:06.246400084 +0000 UTC m=+3088.043031961"
Oct 03 10:35:07 crc kubenswrapper[4990]: I1003 10:35:07.871939 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:35:07 crc kubenswrapper[4990]: E1003 10:35:07.872687 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:35:12 crc kubenswrapper[4990]: I1003 10:35:12.432340 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:12 crc kubenswrapper[4990]: I1003 10:35:12.432762 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:12 crc kubenswrapper[4990]: I1003 10:35:12.496719 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:13 crc kubenswrapper[4990]: I1003 10:35:13.354206 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:13 crc kubenswrapper[4990]: I1003 10:35:13.418551 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:15 crc kubenswrapper[4990]: I1003 10:35:15.310907 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ddcqv" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="registry-server" containerID="cri-o://3419837a0a706ccd6cdaba7477db40eccf5b971fb8e70213fa967bc651f0c088" gracePeriod=2
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.323029 4990 generic.go:334] "Generic (PLEG): container finished" podID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerID="3419837a0a706ccd6cdaba7477db40eccf5b971fb8e70213fa967bc651f0c088" exitCode=0
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.323152 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerDied","Data":"3419837a0a706ccd6cdaba7477db40eccf5b971fb8e70213fa967bc651f0c088"}
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.561993 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.594137 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npfzr\" (UniqueName: \"kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr\") pod \"96df30c2-27a2-4af7-88ec-2c64497c45ea\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") "
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.601169 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr" (OuterVolumeSpecName: "kube-api-access-npfzr") pod "96df30c2-27a2-4af7-88ec-2c64497c45ea" (UID: "96df30c2-27a2-4af7-88ec-2c64497c45ea"). InnerVolumeSpecName "kube-api-access-npfzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.695433 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content\") pod \"96df30c2-27a2-4af7-88ec-2c64497c45ea\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") "
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.695480 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities\") pod \"96df30c2-27a2-4af7-88ec-2c64497c45ea\" (UID: \"96df30c2-27a2-4af7-88ec-2c64497c45ea\") "
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.695768 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npfzr\" (UniqueName: \"kubernetes.io/projected/96df30c2-27a2-4af7-88ec-2c64497c45ea-kube-api-access-npfzr\") on node \"crc\" DevicePath \"\""
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.696247 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities" (OuterVolumeSpecName: "utilities") pod "96df30c2-27a2-4af7-88ec-2c64497c45ea" (UID: "96df30c2-27a2-4af7-88ec-2c64497c45ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.709849 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96df30c2-27a2-4af7-88ec-2c64497c45ea" (UID: "96df30c2-27a2-4af7-88ec-2c64497c45ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.797592 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 10:35:16 crc kubenswrapper[4990]: I1003 10:35:16.797683 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96df30c2-27a2-4af7-88ec-2c64497c45ea-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.333693 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ddcqv" event={"ID":"96df30c2-27a2-4af7-88ec-2c64497c45ea","Type":"ContainerDied","Data":"4bb1f12338e87eae2512fa55923a8310306f85d5a8e5fafb359e6cceef6c9b35"}
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.334293 4990 scope.go:117] "RemoveContainer" containerID="3419837a0a706ccd6cdaba7477db40eccf5b971fb8e70213fa967bc651f0c088"
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.333808 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ddcqv"
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.361461 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.365951 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ddcqv"]
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.377858 4990 scope.go:117] "RemoveContainer" containerID="da8f801db9bfc04790cea01b57ba42a99ce29ea51b270958bdfe0d8e00d8e105"
Oct 03 10:35:17 crc kubenswrapper[4990]: I1003 10:35:17.400781 4990 scope.go:117] "RemoveContainer" containerID="0d18dfa869148e0397fd6b0c22c78def745531f353b0ad36e53b66d8d5faa3de"
Oct 03 10:35:18 crc kubenswrapper[4990]: I1003 10:35:18.891807 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" path="/var/lib/kubelet/pods/96df30c2-27a2-4af7-88ec-2c64497c45ea/volumes"
Oct 03 10:35:21 crc kubenswrapper[4990]: I1003 10:35:21.872295 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:35:21 crc kubenswrapper[4990]: E1003 10:35:21.872811 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:35:36 crc kubenswrapper[4990]: I1003 10:35:36.872183 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:35:36 crc kubenswrapper[4990]: E1003 10:35:36.872957 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:35:49 crc kubenswrapper[4990]: I1003 10:35:49.872018 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:35:49 crc kubenswrapper[4990]: E1003 10:35:49.872844 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:36:01 crc kubenswrapper[4990]: I1003 10:36:01.872598 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:36:01 crc kubenswrapper[4990]: E1003 10:36:01.873646 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:36:16 crc kubenswrapper[4990]: I1003 10:36:16.871886 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:36:16 crc kubenswrapper[4990]: E1003 10:36:16.872663 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:36:30 crc kubenswrapper[4990]: I1003 10:36:30.872167 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:36:30 crc kubenswrapper[4990]: E1003 10:36:30.872975 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:36:44 crc kubenswrapper[4990]: I1003 10:36:44.872069 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:36:44 crc kubenswrapper[4990]: E1003 10:36:44.872934 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:36:57 crc kubenswrapper[4990]: I1003 10:36:57.871969 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:36:57 crc kubenswrapper[4990]: E1003 10:36:57.873148 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:37:11 crc kubenswrapper[4990]: I1003 10:37:11.871272 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:37:11 crc kubenswrapper[4990]: E1003 10:37:11.871964 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 10:37:25 crc kubenswrapper[4990]: I1003 10:37:25.871888 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac"
Oct 03 10:37:26 crc kubenswrapper[4990]: I1003 10:37:26.407492 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8"}
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.901431 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7jc4l"]
Oct 03 10:38:45 crc kubenswrapper[4990]: E1003 10:38:45.902764 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="registry-server"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.902783 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="registry-server"
Oct 03 10:38:45 crc kubenswrapper[4990]: E1003 10:38:45.902796 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="extract-content"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.902812 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="extract-content"
Oct 03 10:38:45 crc kubenswrapper[4990]: E1003 10:38:45.902838 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="extract-utilities"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.902859 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="extract-utilities"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.903035 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="96df30c2-27a2-4af7-88ec-2c64497c45ea" containerName="registry-server"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.905290 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:45 crc kubenswrapper[4990]: I1003 10:38:45.921091 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7jc4l"]
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.007569 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9978n\" (UniqueName: \"kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.007835 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.007883 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.109931 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.109987 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.110024 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9978n\" (UniqueName: \"kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.110854 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.111177 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.134449 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9978n\" (UniqueName: \"kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n\") pod \"redhat-operators-7jc4l\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.236959 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jc4l"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.479022 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7jc4l"]
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.501275 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w72kl"]
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.517191 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.562971 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w72kl"]
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.619655 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t464\" (UniqueName: \"kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.619733 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.619754 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.721611 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t464\" (UniqueName: \"kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.721720 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.721748 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.722363 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.722443 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.740588 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t464\" (UniqueName: \"kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464\") pod \"certified-operators-w72kl\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:46 crc kubenswrapper[4990]: I1003 10:38:46.876200 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w72kl"
Oct 03 10:38:47 crc kubenswrapper[4990]: I1003 10:38:47.061675 4990 generic.go:334] "Generic (PLEG): container finished" podID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerID="30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4" exitCode=0
Oct 03 10:38:47 crc kubenswrapper[4990]: I1003 10:38:47.061756 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerDied","Data":"30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4"}
Oct 03 10:38:47 crc kubenswrapper[4990]: I1003 10:38:47.061946 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerStarted","Data":"23d168e3a079e2e9cc53b6bf66367778b1de9a2e69cf5f0158cc4021731291ab"}
Oct 03 10:38:47 crc kubenswrapper[4990]: I1003 10:38:47.204057 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w72kl"]
Oct 03 10:38:48 crc kubenswrapper[4990]: I1003 10:38:48.073105 4990 generic.go:334] "Generic (PLEG): container finished" podID="46d62848-592d-4533-90c1-78ab32eb2457" containerID="b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1" exitCode=0
Oct 03 10:38:48 crc kubenswrapper[4990]: I1003 10:38:48.073175 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerDied","Data":"b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1"}
Oct 03 10:38:48 crc kubenswrapper[4990]: I1003 10:38:48.073536 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerStarted","Data":"b315c17575873b060bb17b63bc8fc2cac59a23daffa64c398d91f455bc04c339"}
Oct 03 10:38:49 crc kubenswrapper[4990]: I1003 10:38:49.091801 4990 generic.go:334] "Generic (PLEG): container finished" podID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerID="9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f" exitCode=0
Oct 03 10:38:49 crc kubenswrapper[4990]: I1003 10:38:49.091886 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerDied","Data":"9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f"}
Oct 03 10:38:49 crc kubenswrapper[4990]: I1003 10:38:49.095334 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerStarted","Data":"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18"}
Oct 03 10:38:50 crc kubenswrapper[4990]: I1003 10:38:50.111993 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerStarted","Data":"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6"}
Oct 03 10:38:50 crc kubenswrapper[4990]: I1003 10:38:50.117141 4990 generic.go:334] "Generic (PLEG): container finished" podID="46d62848-592d-4533-90c1-78ab32eb2457" containerID="804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18" exitCode=0
Oct 03 10:38:50 crc kubenswrapper[4990]: I1003 10:38:50.117191 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerDied","Data":"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18"}
Oct 03 10:38:50 crc kubenswrapper[4990]: I1003 10:38:50.143066 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7jc4l" podStartSLOduration=2.463172726 podStartE2EDuration="5.143044515s" podCreationTimestamp="2025-10-03 10:38:45 +0000 UTC" firstStartedPulling="2025-10-03 10:38:47.062912149 +0000 UTC m=+3308.859544006" lastFinishedPulling="2025-10-03 10:38:49.742783888 +0000 UTC m=+3311.539415795" observedRunningTime="2025-10-03 10:38:50.138145149 +0000 UTC m=+3311.934777046" watchObservedRunningTime="2025-10-03 10:38:50.143044515 +0000 UTC m=+3311.939676402"
Oct 03 10:38:51 crc kubenswrapper[4990]: I1003 10:38:51.136888 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerStarted","Data":"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad"}
Oct 03 10:38:51 crc kubenswrapper[4990]: I1003 10:38:51.157014 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w72kl" podStartSLOduration=2.324368017 podStartE2EDuration="5.156996049s" podCreationTimestamp="2025-10-03 10:38:46 +0000 UTC" firstStartedPulling="2025-10-03 10:38:48.075578029 +0000 UTC m=+3309.872209886" lastFinishedPulling="2025-10-03 10:38:50.908206051 +0000 UTC m=+3312.704837918" observedRunningTime="2025-10-03 10:38:51.15200252 +0000 UTC m=+3312.948634437" watchObservedRunningTime="2025-10-03
10:38:51.156996049 +0000 UTC m=+3312.953627906" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.237912 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.238651 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.316781 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.890037 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.890106 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:38:56 crc kubenswrapper[4990]: I1003 10:38:56.941745 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:38:57 crc kubenswrapper[4990]: I1003 10:38:57.239444 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:38:57 crc kubenswrapper[4990]: I1003 10:38:57.256712 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:38:57 crc kubenswrapper[4990]: I1003 10:38:57.770329 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7jc4l"] Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.200408 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7jc4l" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="registry-server" 
containerID="cri-o://b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6" gracePeriod=2 Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.570362 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w72kl"] Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.570718 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w72kl" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="registry-server" containerID="cri-o://fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad" gracePeriod=2 Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.773525 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.917117 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities\") pod \"14b97adc-860d-4382-9b4b-1da0c57aaf70\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.917252 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9978n\" (UniqueName: \"kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n\") pod \"14b97adc-860d-4382-9b4b-1da0c57aaf70\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.917394 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content\") pod \"14b97adc-860d-4382-9b4b-1da0c57aaf70\" (UID: \"14b97adc-860d-4382-9b4b-1da0c57aaf70\") " Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.918267 4990 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities" (OuterVolumeSpecName: "utilities") pod "14b97adc-860d-4382-9b4b-1da0c57aaf70" (UID: "14b97adc-860d-4382-9b4b-1da0c57aaf70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.924022 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n" (OuterVolumeSpecName: "kube-api-access-9978n") pod "14b97adc-860d-4382-9b4b-1da0c57aaf70" (UID: "14b97adc-860d-4382-9b4b-1da0c57aaf70"). InnerVolumeSpecName "kube-api-access-9978n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:38:59 crc kubenswrapper[4990]: I1003 10:38:59.927730 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.017097 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14b97adc-860d-4382-9b4b-1da0c57aaf70" (UID: "14b97adc-860d-4382-9b4b-1da0c57aaf70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.019551 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9978n\" (UniqueName: \"kubernetes.io/projected/14b97adc-860d-4382-9b4b-1da0c57aaf70-kube-api-access-9978n\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.019587 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.019600 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14b97adc-860d-4382-9b4b-1da0c57aaf70-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.120613 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content\") pod \"46d62848-592d-4533-90c1-78ab32eb2457\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.120702 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t464\" (UniqueName: \"kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464\") pod \"46d62848-592d-4533-90c1-78ab32eb2457\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.120821 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities\") pod \"46d62848-592d-4533-90c1-78ab32eb2457\" (UID: \"46d62848-592d-4533-90c1-78ab32eb2457\") " Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.121978 
4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities" (OuterVolumeSpecName: "utilities") pod "46d62848-592d-4533-90c1-78ab32eb2457" (UID: "46d62848-592d-4533-90c1-78ab32eb2457"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.123346 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464" (OuterVolumeSpecName: "kube-api-access-7t464") pod "46d62848-592d-4533-90c1-78ab32eb2457" (UID: "46d62848-592d-4533-90c1-78ab32eb2457"). InnerVolumeSpecName "kube-api-access-7t464". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.197703 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46d62848-592d-4533-90c1-78ab32eb2457" (UID: "46d62848-592d-4533-90c1-78ab32eb2457"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.211105 4990 generic.go:334] "Generic (PLEG): container finished" podID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerID="b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6" exitCode=0 Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.211193 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerDied","Data":"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6"} Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.211245 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7jc4l" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.211262 4990 scope.go:117] "RemoveContainer" containerID="b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.211250 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jc4l" event={"ID":"14b97adc-860d-4382-9b4b-1da0c57aaf70","Type":"ContainerDied","Data":"23d168e3a079e2e9cc53b6bf66367778b1de9a2e69cf5f0158cc4021731291ab"} Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.215466 4990 generic.go:334] "Generic (PLEG): container finished" podID="46d62848-592d-4533-90c1-78ab32eb2457" containerID="fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad" exitCode=0 Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.215563 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerDied","Data":"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad"} Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.215597 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w72kl" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.215608 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w72kl" event={"ID":"46d62848-592d-4533-90c1-78ab32eb2457","Type":"ContainerDied","Data":"b315c17575873b060bb17b63bc8fc2cac59a23daffa64c398d91f455bc04c339"} Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.222235 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.222264 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d62848-592d-4533-90c1-78ab32eb2457-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.222276 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t464\" (UniqueName: \"kubernetes.io/projected/46d62848-592d-4533-90c1-78ab32eb2457-kube-api-access-7t464\") on node \"crc\" DevicePath \"\"" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.244651 4990 scope.go:117] "RemoveContainer" containerID="9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.286575 4990 scope.go:117] "RemoveContainer" containerID="30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.287892 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w72kl"] Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.295659 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w72kl"] Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.300597 4990 scope.go:117] 
"RemoveContainer" containerID="b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6" Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.301011 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6\": container with ID starting with b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6 not found: ID does not exist" containerID="b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301051 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6"} err="failed to get container status \"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6\": rpc error: code = NotFound desc = could not find container \"b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6\": container with ID starting with b62540f1bfcdd771b5bb9bc1c192387e4c8512651a9a1610b63c27ca76b66bb6 not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301081 4990 scope.go:117] "RemoveContainer" containerID="9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f" Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.301362 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f\": container with ID starting with 9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f not found: ID does not exist" containerID="9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301392 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f"} err="failed to get container status \"9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f\": rpc error: code = NotFound desc = could not find container \"9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f\": container with ID starting with 9329bae5ce60d0fa0e138b2ca031a94a4f0f78ee9e842ab5adfebd454cc7859f not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301408 4990 scope.go:117] "RemoveContainer" containerID="30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4" Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.301623 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4\": container with ID starting with 30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4 not found: ID does not exist" containerID="30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301649 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4"} err="failed to get container status \"30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4\": rpc error: code = NotFound desc = could not find container \"30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4\": container with ID starting with 30c554d9e5c440aed660b39307b1cb277cc1656fff02f80ad1da535fbe8b4bc4 not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.301662 4990 scope.go:117] "RemoveContainer" containerID="fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.302315 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-7jc4l"] Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.308685 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7jc4l"] Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.313544 4990 scope.go:117] "RemoveContainer" containerID="804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.328472 4990 scope.go:117] "RemoveContainer" containerID="b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.347787 4990 scope.go:117] "RemoveContainer" containerID="fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad" Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.348330 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad\": container with ID starting with fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad not found: ID does not exist" containerID="fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.348371 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad"} err="failed to get container status \"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad\": rpc error: code = NotFound desc = could not find container \"fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad\": container with ID starting with fa1a70e21bcccdb5f1d01d577f5d5673903078e8393b6342a1149c2b9ad852ad not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.348399 4990 scope.go:117] "RemoveContainer" containerID="804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18" 
Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.348688 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18\": container with ID starting with 804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18 not found: ID does not exist" containerID="804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.348710 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18"} err="failed to get container status \"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18\": rpc error: code = NotFound desc = could not find container \"804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18\": container with ID starting with 804e07d6bdf862c043c91a10bb7c9e51b0712559b0d1702d4446398fdeb27b18 not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.348725 4990 scope.go:117] "RemoveContainer" containerID="b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1" Oct 03 10:39:00 crc kubenswrapper[4990]: E1003 10:39:00.349024 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1\": container with ID starting with b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1 not found: ID does not exist" containerID="b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.349050 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1"} err="failed to get container status 
\"b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1\": rpc error: code = NotFound desc = could not find container \"b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1\": container with ID starting with b2c098651f07d1b062ab1f500bf3c81211e700ef309bdc045c45558c1da5c3d1 not found: ID does not exist" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.888920 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" path="/var/lib/kubelet/pods/14b97adc-860d-4382-9b4b-1da0c57aaf70/volumes" Oct 03 10:39:00 crc kubenswrapper[4990]: I1003 10:39:00.890388 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d62848-592d-4533-90c1-78ab32eb2457" path="/var/lib/kubelet/pods/46d62848-592d-4533-90c1-78ab32eb2457/volumes" Oct 03 10:39:55 crc kubenswrapper[4990]: I1003 10:39:55.303823 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:39:55 crc kubenswrapper[4990]: I1003 10:39:55.304442 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:40:25 crc kubenswrapper[4990]: I1003 10:40:25.304470 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:40:25 crc kubenswrapper[4990]: I1003 10:40:25.305717 4990 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:40:55 crc kubenswrapper[4990]: I1003 10:40:55.304194 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:40:55 crc kubenswrapper[4990]: I1003 10:40:55.304878 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:40:55 crc kubenswrapper[4990]: I1003 10:40:55.304943 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:40:55 crc kubenswrapper[4990]: I1003 10:40:55.305799 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:40:55 crc kubenswrapper[4990]: I1003 10:40:55.305927 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" 
containerName="machine-config-daemon" containerID="cri-o://3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8" gracePeriod=600 Oct 03 10:40:56 crc kubenswrapper[4990]: I1003 10:40:56.291053 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8" exitCode=0 Oct 03 10:40:56 crc kubenswrapper[4990]: I1003 10:40:56.291129 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8"} Oct 03 10:40:56 crc kubenswrapper[4990]: I1003 10:40:56.291400 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4"} Oct 03 10:40:56 crc kubenswrapper[4990]: I1003 10:40:56.291426 4990 scope.go:117] "RemoveContainer" containerID="c6456269526b08e2207c8ce5a80d9829c15affdcfedf31beecc8383d4fc74bac" Oct 03 10:42:55 crc kubenswrapper[4990]: I1003 10:42:55.303385 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:42:55 crc kubenswrapper[4990]: I1003 10:42:55.304021 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 
03 10:43:25 crc kubenswrapper[4990]: I1003 10:43:25.303664 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:43:25 crc kubenswrapper[4990]: I1003 10:43:25.304308 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.304294 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.305001 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.305066 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.306138 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4"} 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.306229 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" gracePeriod=600 Oct 03 10:43:55 crc kubenswrapper[4990]: E1003 10:43:55.432486 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.855065 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" exitCode=0 Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.855131 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4"} Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.855177 4990 scope.go:117] "RemoveContainer" containerID="3e229b314a8e85bcc03a52d706a25d2fecb620f8876f20b701842ea7a6376ed8" Oct 03 10:43:55 crc kubenswrapper[4990]: I1003 10:43:55.855930 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 
03 10:43:55 crc kubenswrapper[4990]: E1003 10:43:55.856279 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.227857 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228199 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="extract-utilities" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228216 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="extract-utilities" Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228235 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="extract-content" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228243 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="extract-content" Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228256 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="extract-utilities" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228264 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="extract-utilities" Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228293 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228300 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228309 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="extract-content" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228316 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="extract-content" Oct 03 10:43:57 crc kubenswrapper[4990]: E1003 10:43:57.228332 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228340 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228542 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b97adc-860d-4382-9b4b-1da0c57aaf70" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.228556 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d62848-592d-4533-90c1-78ab32eb2457" containerName="registry-server" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.229589 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.247318 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.354348 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.354921 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5g7b\" (UniqueName: \"kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.355115 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.460228 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.460298 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.460322 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5g7b\" (UniqueName: \"kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.460926 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.461002 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.482562 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5g7b\" (UniqueName: \"kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b\") pod \"community-operators-6q9q7\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:57 crc kubenswrapper[4990]: I1003 10:43:57.547738 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:43:58 crc kubenswrapper[4990]: I1003 10:43:58.035121 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:43:58 crc kubenswrapper[4990]: I1003 10:43:58.892260 4990 generic.go:334] "Generic (PLEG): container finished" podID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerID="495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4" exitCode=0 Oct 03 10:43:58 crc kubenswrapper[4990]: I1003 10:43:58.892443 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerDied","Data":"495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4"} Oct 03 10:43:58 crc kubenswrapper[4990]: I1003 10:43:58.892654 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerStarted","Data":"6564f01abc010dc3bbdcc4d6b6c243047f262de5e34a42d7c41efc8d51d26873"} Oct 03 10:43:58 crc kubenswrapper[4990]: I1003 10:43:58.896068 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:43:59 crc kubenswrapper[4990]: I1003 10:43:59.905264 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerStarted","Data":"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11"} Oct 03 10:44:00 crc kubenswrapper[4990]: I1003 10:44:00.917546 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerDied","Data":"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11"} Oct 03 10:44:00 crc 
kubenswrapper[4990]: I1003 10:44:00.917495 4990 generic.go:334] "Generic (PLEG): container finished" podID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerID="70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11" exitCode=0 Oct 03 10:44:01 crc kubenswrapper[4990]: I1003 10:44:01.931990 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerStarted","Data":"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9"} Oct 03 10:44:01 crc kubenswrapper[4990]: I1003 10:44:01.956625 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q9q7" podStartSLOduration=2.367534833 podStartE2EDuration="4.956603331s" podCreationTimestamp="2025-10-03 10:43:57 +0000 UTC" firstStartedPulling="2025-10-03 10:43:58.895807232 +0000 UTC m=+3620.692439099" lastFinishedPulling="2025-10-03 10:44:01.4848757 +0000 UTC m=+3623.281507597" observedRunningTime="2025-10-03 10:44:01.950325998 +0000 UTC m=+3623.746957855" watchObservedRunningTime="2025-10-03 10:44:01.956603331 +0000 UTC m=+3623.753235188" Oct 03 10:44:06 crc kubenswrapper[4990]: I1003 10:44:06.872139 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:44:06 crc kubenswrapper[4990]: E1003 10:44:06.872442 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:44:07 crc kubenswrapper[4990]: I1003 10:44:07.548311 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:07 crc kubenswrapper[4990]: I1003 10:44:07.548353 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:07 crc kubenswrapper[4990]: I1003 10:44:07.609545 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:08 crc kubenswrapper[4990]: I1003 10:44:08.035208 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:08 crc kubenswrapper[4990]: I1003 10:44:08.092742 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:44:09 crc kubenswrapper[4990]: I1003 10:44:09.995554 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q9q7" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="registry-server" containerID="cri-o://5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9" gracePeriod=2 Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.404606 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.523762 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content\") pod \"55859b2a-35f6-430c-8f52-601abcc4a25f\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.523864 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5g7b\" (UniqueName: \"kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b\") pod \"55859b2a-35f6-430c-8f52-601abcc4a25f\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.523906 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities\") pod \"55859b2a-35f6-430c-8f52-601abcc4a25f\" (UID: \"55859b2a-35f6-430c-8f52-601abcc4a25f\") " Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.525400 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities" (OuterVolumeSpecName: "utilities") pod "55859b2a-35f6-430c-8f52-601abcc4a25f" (UID: "55859b2a-35f6-430c-8f52-601abcc4a25f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.530749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b" (OuterVolumeSpecName: "kube-api-access-w5g7b") pod "55859b2a-35f6-430c-8f52-601abcc4a25f" (UID: "55859b2a-35f6-430c-8f52-601abcc4a25f"). InnerVolumeSpecName "kube-api-access-w5g7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.609106 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55859b2a-35f6-430c-8f52-601abcc4a25f" (UID: "55859b2a-35f6-430c-8f52-601abcc4a25f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.625347 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.625380 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5g7b\" (UniqueName: \"kubernetes.io/projected/55859b2a-35f6-430c-8f52-601abcc4a25f-kube-api-access-w5g7b\") on node \"crc\" DevicePath \"\"" Oct 03 10:44:10 crc kubenswrapper[4990]: I1003 10:44:10.625396 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55859b2a-35f6-430c-8f52-601abcc4a25f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.007968 4990 generic.go:334] "Generic (PLEG): container finished" podID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerID="5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9" exitCode=0 Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.008033 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerDied","Data":"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9"} Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.008073 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6q9q7" event={"ID":"55859b2a-35f6-430c-8f52-601abcc4a25f","Type":"ContainerDied","Data":"6564f01abc010dc3bbdcc4d6b6c243047f262de5e34a42d7c41efc8d51d26873"} Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.008097 4990 scope.go:117] "RemoveContainer" containerID="5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.008157 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q9q7" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.061589 4990 scope.go:117] "RemoveContainer" containerID="70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.092292 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.102778 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q9q7"] Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.120614 4990 scope.go:117] "RemoveContainer" containerID="495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.139652 4990 scope.go:117] "RemoveContainer" containerID="5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9" Oct 03 10:44:11 crc kubenswrapper[4990]: E1003 10:44:11.140775 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9\": container with ID starting with 5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9 not found: ID does not exist" containerID="5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 
10:44:11.140840 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9"} err="failed to get container status \"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9\": rpc error: code = NotFound desc = could not find container \"5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9\": container with ID starting with 5a093193322b5c5d27e38f757fa931f3db6f7375566bbbf0518537333b4735b9 not found: ID does not exist" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.140879 4990 scope.go:117] "RemoveContainer" containerID="70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11" Oct 03 10:44:11 crc kubenswrapper[4990]: E1003 10:44:11.141138 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11\": container with ID starting with 70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11 not found: ID does not exist" containerID="70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.141165 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11"} err="failed to get container status \"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11\": rpc error: code = NotFound desc = could not find container \"70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11\": container with ID starting with 70dfc23d4e5483e4809160d48d718201a4736cdbdbb4e9f382723085cbf2df11 not found: ID does not exist" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.141182 4990 scope.go:117] "RemoveContainer" containerID="495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4" Oct 03 10:44:11 crc 
kubenswrapper[4990]: E1003 10:44:11.141461 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4\": container with ID starting with 495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4 not found: ID does not exist" containerID="495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4" Oct 03 10:44:11 crc kubenswrapper[4990]: I1003 10:44:11.141488 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4"} err="failed to get container status \"495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4\": rpc error: code = NotFound desc = could not find container \"495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4\": container with ID starting with 495d661b4cbce7cbfa3686bdc8866ed50e64a3bc8df4a258751846fcd6510fb4 not found: ID does not exist" Oct 03 10:44:12 crc kubenswrapper[4990]: I1003 10:44:12.882621 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" path="/var/lib/kubelet/pods/55859b2a-35f6-430c-8f52-601abcc4a25f/volumes" Oct 03 10:44:19 crc kubenswrapper[4990]: I1003 10:44:19.872148 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:44:19 crc kubenswrapper[4990]: E1003 10:44:19.873305 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:44:34 crc 
kubenswrapper[4990]: I1003 10:44:34.872926 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:44:34 crc kubenswrapper[4990]: E1003 10:44:34.874119 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:44:49 crc kubenswrapper[4990]: I1003 10:44:49.872109 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:44:49 crc kubenswrapper[4990]: E1003 10:44:49.874032 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.190729 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482"] Oct 03 10:45:00 crc kubenswrapper[4990]: E1003 10:45:00.191642 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="extract-utilities" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.191657 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="extract-utilities" Oct 03 10:45:00 crc kubenswrapper[4990]: E1003 10:45:00.191692 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="extract-content" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.191700 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="extract-content" Oct 03 10:45:00 crc kubenswrapper[4990]: E1003 10:45:00.191715 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="registry-server" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.191723 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="registry-server" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.191897 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="55859b2a-35f6-430c-8f52-601abcc4a25f" containerName="registry-server" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.192446 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.195234 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.195599 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.201594 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482"] Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.316534 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.316649 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgrm\" (UniqueName: \"kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.316695 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.417577 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.417681 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.417726 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgrm\" (UniqueName: \"kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.418623 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.423941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.434779 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgrm\" (UniqueName: \"kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm\") pod \"collect-profiles-29324805-v4482\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.514348 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:00 crc kubenswrapper[4990]: I1003 10:45:00.775199 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482"] Oct 03 10:45:01 crc kubenswrapper[4990]: I1003 10:45:01.469222 4990 generic.go:334] "Generic (PLEG): container finished" podID="56952dbd-209c-4fad-adfe-ea5dc0a0c349" containerID="b28df4a58f97f2ffcc59aa61b5b4f893608704a164f93fc76d33c812bef6d2ab" exitCode=0 Oct 03 10:45:01 crc kubenswrapper[4990]: I1003 10:45:01.469573 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" event={"ID":"56952dbd-209c-4fad-adfe-ea5dc0a0c349","Type":"ContainerDied","Data":"b28df4a58f97f2ffcc59aa61b5b4f893608704a164f93fc76d33c812bef6d2ab"} Oct 03 10:45:01 crc kubenswrapper[4990]: I1003 10:45:01.469616 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" 
event={"ID":"56952dbd-209c-4fad-adfe-ea5dc0a0c349","Type":"ContainerStarted","Data":"54562085c849434ec9679d1b62941b8b408f4a9badb0cc70e205eafd28c6dbfa"} Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.722921 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.724841 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.728452 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.816942 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.872320 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.873995 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.874188 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8cj\" (UniqueName: \"kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.874393 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: E1003 10:45:02.875998 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.975487 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume\") pod \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.975614 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume\") pod \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.975651 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwgrm\" (UniqueName: \"kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm\") pod \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\" (UID: \"56952dbd-209c-4fad-adfe-ea5dc0a0c349\") " Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.975790 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.976003 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.976042 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8cj\" (UniqueName: \"kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.976731 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.976723 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.977317 4990 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume" (OuterVolumeSpecName: "config-volume") pod "56952dbd-209c-4fad-adfe-ea5dc0a0c349" (UID: "56952dbd-209c-4fad-adfe-ea5dc0a0c349"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.983176 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56952dbd-209c-4fad-adfe-ea5dc0a0c349" (UID: "56952dbd-209c-4fad-adfe-ea5dc0a0c349"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:45:02 crc kubenswrapper[4990]: I1003 10:45:02.994336 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm" (OuterVolumeSpecName: "kube-api-access-vwgrm") pod "56952dbd-209c-4fad-adfe-ea5dc0a0c349" (UID: "56952dbd-209c-4fad-adfe-ea5dc0a0c349"). InnerVolumeSpecName "kube-api-access-vwgrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.002316 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8cj\" (UniqueName: \"kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj\") pod \"redhat-marketplace-66q7z\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.049895 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.077195 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56952dbd-209c-4fad-adfe-ea5dc0a0c349-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.077382 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwgrm\" (UniqueName: \"kubernetes.io/projected/56952dbd-209c-4fad-adfe-ea5dc0a0c349-kube-api-access-vwgrm\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.077584 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56952dbd-209c-4fad-adfe-ea5dc0a0c349-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.240164 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:03 crc kubenswrapper[4990]: W1003 10:45:03.247294 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14eb2f60_4080_4ea9_aab2_1aa2dafda347.slice/crio-a9f3b51ea8c247922ed29e8d3c8deaf69d5d8769f6218cc8d95235c70442a0e9 WatchSource:0}: Error finding container a9f3b51ea8c247922ed29e8d3c8deaf69d5d8769f6218cc8d95235c70442a0e9: Status 404 returned error can't find the container with id a9f3b51ea8c247922ed29e8d3c8deaf69d5d8769f6218cc8d95235c70442a0e9 Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.491262 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.491306 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482" event={"ID":"56952dbd-209c-4fad-adfe-ea5dc0a0c349","Type":"ContainerDied","Data":"54562085c849434ec9679d1b62941b8b408f4a9badb0cc70e205eafd28c6dbfa"} Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.491370 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54562085c849434ec9679d1b62941b8b408f4a9badb0cc70e205eafd28c6dbfa" Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.495917 4990 generic.go:334] "Generic (PLEG): container finished" podID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerID="47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317" exitCode=0 Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.495978 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerDied","Data":"47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317"} Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.496012 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerStarted","Data":"a9f3b51ea8c247922ed29e8d3c8deaf69d5d8769f6218cc8d95235c70442a0e9"} Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.887080 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m"] Oct 03 10:45:03 crc kubenswrapper[4990]: I1003 10:45:03.893733 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-9hf2m"] Oct 03 10:45:04 crc kubenswrapper[4990]: I1003 
10:45:04.506470 4990 generic.go:334] "Generic (PLEG): container finished" podID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerID="2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496" exitCode=0 Oct 03 10:45:04 crc kubenswrapper[4990]: I1003 10:45:04.506580 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerDied","Data":"2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496"} Oct 03 10:45:04 crc kubenswrapper[4990]: I1003 10:45:04.884536 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7270bda0-0176-495e-8111-623e25e97ec6" path="/var/lib/kubelet/pods/7270bda0-0176-495e-8111-623e25e97ec6/volumes" Oct 03 10:45:05 crc kubenswrapper[4990]: I1003 10:45:05.525965 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerStarted","Data":"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c"} Oct 03 10:45:05 crc kubenswrapper[4990]: I1003 10:45:05.550062 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66q7z" podStartSLOduration=1.8445449269999998 podStartE2EDuration="3.550046026s" podCreationTimestamp="2025-10-03 10:45:02 +0000 UTC" firstStartedPulling="2025-10-03 10:45:03.498840523 +0000 UTC m=+3685.295472420" lastFinishedPulling="2025-10-03 10:45:05.204341662 +0000 UTC m=+3687.000973519" observedRunningTime="2025-10-03 10:45:05.544081031 +0000 UTC m=+3687.340712908" watchObservedRunningTime="2025-10-03 10:45:05.550046026 +0000 UTC m=+3687.346677883" Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.050866 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.051403 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.095687 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.685339 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.753448 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:13 crc kubenswrapper[4990]: I1003 10:45:13.871721 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:45:13 crc kubenswrapper[4990]: E1003 10:45:13.872096 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:45:15 crc kubenswrapper[4990]: I1003 10:45:15.626339 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66q7z" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="registry-server" containerID="cri-o://8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c" gracePeriod=2 Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.032752 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.145821 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities\") pod \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.145917 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p8cj\" (UniqueName: \"kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj\") pod \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.145985 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content\") pod \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\" (UID: \"14eb2f60-4080-4ea9-aab2-1aa2dafda347\") " Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.146703 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities" (OuterVolumeSpecName: "utilities") pod "14eb2f60-4080-4ea9-aab2-1aa2dafda347" (UID: "14eb2f60-4080-4ea9-aab2-1aa2dafda347"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.152757 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj" (OuterVolumeSpecName: "kube-api-access-4p8cj") pod "14eb2f60-4080-4ea9-aab2-1aa2dafda347" (UID: "14eb2f60-4080-4ea9-aab2-1aa2dafda347"). InnerVolumeSpecName "kube-api-access-4p8cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.161080 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14eb2f60-4080-4ea9-aab2-1aa2dafda347" (UID: "14eb2f60-4080-4ea9-aab2-1aa2dafda347"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.247825 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p8cj\" (UniqueName: \"kubernetes.io/projected/14eb2f60-4080-4ea9-aab2-1aa2dafda347-kube-api-access-4p8cj\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.247859 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.247867 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14eb2f60-4080-4ea9-aab2-1aa2dafda347-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.639257 4990 generic.go:334] "Generic (PLEG): container finished" podID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerID="8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c" exitCode=0 Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.639312 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66q7z" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.639342 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerDied","Data":"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c"} Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.640144 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66q7z" event={"ID":"14eb2f60-4080-4ea9-aab2-1aa2dafda347","Type":"ContainerDied","Data":"a9f3b51ea8c247922ed29e8d3c8deaf69d5d8769f6218cc8d95235c70442a0e9"} Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.640168 4990 scope.go:117] "RemoveContainer" containerID="8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.657461 4990 scope.go:117] "RemoveContainer" containerID="2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.674884 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.687166 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66q7z"] Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.694320 4990 scope.go:117] "RemoveContainer" containerID="47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.716441 4990 scope.go:117] "RemoveContainer" containerID="8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c" Oct 03 10:45:16 crc kubenswrapper[4990]: E1003 10:45:16.717124 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c\": container with ID starting with 8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c not found: ID does not exist" containerID="8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.717192 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c"} err="failed to get container status \"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c\": rpc error: code = NotFound desc = could not find container \"8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c\": container with ID starting with 8f5e32a9f18f0139b258cd569411201ee71b05098d4a1046686b95502f006f1c not found: ID does not exist" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.717215 4990 scope.go:117] "RemoveContainer" containerID="2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496" Oct 03 10:45:16 crc kubenswrapper[4990]: E1003 10:45:16.717565 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496\": container with ID starting with 2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496 not found: ID does not exist" containerID="2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.717598 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496"} err="failed to get container status \"2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496\": rpc error: code = NotFound desc = could not find container \"2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496\": container with ID 
starting with 2fd26575ce56aa809384f1472a04f700ecf25366b712206009ea394ba7d19496 not found: ID does not exist" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.717617 4990 scope.go:117] "RemoveContainer" containerID="47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317" Oct 03 10:45:16 crc kubenswrapper[4990]: E1003 10:45:16.717848 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317\": container with ID starting with 47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317 not found: ID does not exist" containerID="47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.717881 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317"} err="failed to get container status \"47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317\": rpc error: code = NotFound desc = could not find container \"47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317\": container with ID starting with 47508568dddbbe410fbc279246f705c861cd35b55fc69f7d0da590362b121317 not found: ID does not exist" Oct 03 10:45:16 crc kubenswrapper[4990]: I1003 10:45:16.880142 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" path="/var/lib/kubelet/pods/14eb2f60-4080-4ea9-aab2-1aa2dafda347/volumes" Oct 03 10:45:25 crc kubenswrapper[4990]: I1003 10:45:25.871576 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:45:25 crc kubenswrapper[4990]: E1003 10:45:25.872742 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:45:37 crc kubenswrapper[4990]: I1003 10:45:37.872374 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:45:37 crc kubenswrapper[4990]: E1003 10:45:37.873229 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:45:46 crc kubenswrapper[4990]: I1003 10:45:46.888428 4990 scope.go:117] "RemoveContainer" containerID="86f066598dfbcc192921b0771831b7c09f2cd78f86c7fe9092a8a99ca20ac41b" Oct 03 10:45:51 crc kubenswrapper[4990]: I1003 10:45:51.875414 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:45:51 crc kubenswrapper[4990]: E1003 10:45:51.876231 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:46:04 crc kubenswrapper[4990]: I1003 10:46:04.872524 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:46:04 crc 
kubenswrapper[4990]: E1003 10:46:04.873333 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:46:19 crc kubenswrapper[4990]: I1003 10:46:19.871794 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:46:19 crc kubenswrapper[4990]: E1003 10:46:19.873221 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:46:31 crc kubenswrapper[4990]: I1003 10:46:31.871836 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:46:31 crc kubenswrapper[4990]: E1003 10:46:31.872789 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:46:43 crc kubenswrapper[4990]: I1003 10:46:43.872090 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 
03 10:46:43 crc kubenswrapper[4990]: E1003 10:46:43.874013 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:46:57 crc kubenswrapper[4990]: I1003 10:46:57.871892 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:46:57 crc kubenswrapper[4990]: E1003 10:46:57.872896 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:47:12 crc kubenswrapper[4990]: I1003 10:47:12.878166 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:47:12 crc kubenswrapper[4990]: E1003 10:47:12.879987 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:47:26 crc kubenswrapper[4990]: I1003 10:47:26.872683 4990 scope.go:117] "RemoveContainer" 
containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:47:26 crc kubenswrapper[4990]: E1003 10:47:26.874085 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:47:38 crc kubenswrapper[4990]: I1003 10:47:38.876469 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:47:38 crc kubenswrapper[4990]: E1003 10:47:38.877646 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:47:52 crc kubenswrapper[4990]: I1003 10:47:52.871683 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:47:52 crc kubenswrapper[4990]: E1003 10:47:52.872568 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:48:03 crc kubenswrapper[4990]: I1003 10:48:03.871850 4990 scope.go:117] 
"RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:48:03 crc kubenswrapper[4990]: E1003 10:48:03.873672 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:48:17 crc kubenswrapper[4990]: I1003 10:48:17.872479 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:48:17 crc kubenswrapper[4990]: E1003 10:48:17.873461 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:48:31 crc kubenswrapper[4990]: I1003 10:48:31.872689 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:48:31 crc kubenswrapper[4990]: E1003 10:48:31.873769 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:48:42 crc kubenswrapper[4990]: I1003 10:48:42.871618 
4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:48:42 crc kubenswrapper[4990]: E1003 10:48:42.872498 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:48:56 crc kubenswrapper[4990]: I1003 10:48:56.872257 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:48:57 crc kubenswrapper[4990]: I1003 10:48:57.423916 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a"} Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.002241 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:49:47 crc kubenswrapper[4990]: E1003 10:49:47.005931 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="extract-utilities" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.005976 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="extract-utilities" Oct 03 10:49:47 crc kubenswrapper[4990]: E1003 10:49:47.005995 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56952dbd-209c-4fad-adfe-ea5dc0a0c349" containerName="collect-profiles" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.006006 4990 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="56952dbd-209c-4fad-adfe-ea5dc0a0c349" containerName="collect-profiles" Oct 03 10:49:47 crc kubenswrapper[4990]: E1003 10:49:47.006018 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="registry-server" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.006027 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="registry-server" Oct 03 10:49:47 crc kubenswrapper[4990]: E1003 10:49:47.006087 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="extract-content" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.006098 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="extract-content" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.006280 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="56952dbd-209c-4fad-adfe-ea5dc0a0c349" containerName="collect-profiles" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.006301 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="14eb2f60-4080-4ea9-aab2-1aa2dafda347" containerName="registry-server" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.007609 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.021636 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.084847 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9vfs\" (UniqueName: \"kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.084917 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.084955 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.186370 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.186462 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n9vfs\" (UniqueName: \"kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.186498 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.187059 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.187222 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.205598 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9vfs\" (UniqueName: \"kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs\") pod \"redhat-operators-4jrtc\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.325765 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.780953 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:49:47 crc kubenswrapper[4990]: W1003 10:49:47.782714 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2c72f5_5150_4661_a205_5be4b412b7ae.slice/crio-b98c7c98dde1ac8d2e246fee0135fb173cdfebf68fb8f9aa94d2538d035eef0d WatchSource:0}: Error finding container b98c7c98dde1ac8d2e246fee0135fb173cdfebf68fb8f9aa94d2538d035eef0d: Status 404 returned error can't find the container with id b98c7c98dde1ac8d2e246fee0135fb173cdfebf68fb8f9aa94d2538d035eef0d Oct 03 10:49:47 crc kubenswrapper[4990]: I1003 10:49:47.887838 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerStarted","Data":"b98c7c98dde1ac8d2e246fee0135fb173cdfebf68fb8f9aa94d2538d035eef0d"} Oct 03 10:49:48 crc kubenswrapper[4990]: I1003 10:49:48.897298 4990 generic.go:334] "Generic (PLEG): container finished" podID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerID="7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610" exitCode=0 Oct 03 10:49:48 crc kubenswrapper[4990]: I1003 10:49:48.897394 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerDied","Data":"7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610"} Oct 03 10:49:48 crc kubenswrapper[4990]: I1003 10:49:48.899752 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:49:50 crc kubenswrapper[4990]: I1003 10:49:50.914906 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerID="6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c" exitCode=0 Oct 03 10:49:50 crc kubenswrapper[4990]: I1003 10:49:50.915013 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerDied","Data":"6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c"} Oct 03 10:49:51 crc kubenswrapper[4990]: I1003 10:49:51.923226 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerStarted","Data":"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a"} Oct 03 10:49:51 crc kubenswrapper[4990]: I1003 10:49:51.948502 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4jrtc" podStartSLOduration=3.540432226 podStartE2EDuration="5.9484786s" podCreationTimestamp="2025-10-03 10:49:46 +0000 UTC" firstStartedPulling="2025-10-03 10:49:48.899334474 +0000 UTC m=+3970.695966321" lastFinishedPulling="2025-10-03 10:49:51.307380798 +0000 UTC m=+3973.104012695" observedRunningTime="2025-10-03 10:49:51.945666617 +0000 UTC m=+3973.742298534" watchObservedRunningTime="2025-10-03 10:49:51.9484786 +0000 UTC m=+3973.745110457" Oct 03 10:49:57 crc kubenswrapper[4990]: I1003 10:49:57.327363 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:57 crc kubenswrapper[4990]: I1003 10:49:57.327940 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:57 crc kubenswrapper[4990]: I1003 10:49:57.376480 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:58 crc 
kubenswrapper[4990]: I1003 10:49:58.022380 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:49:58 crc kubenswrapper[4990]: I1003 10:49:58.083917 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:49:59 crc kubenswrapper[4990]: I1003 10:49:59.984965 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4jrtc" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="registry-server" containerID="cri-o://073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a" gracePeriod=2 Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.464159 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.592043 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9vfs\" (UniqueName: \"kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs\") pod \"8c2c72f5-5150-4661-a205-5be4b412b7ae\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.592151 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content\") pod \"8c2c72f5-5150-4661-a205-5be4b412b7ae\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.592203 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities\") pod \"8c2c72f5-5150-4661-a205-5be4b412b7ae\" (UID: \"8c2c72f5-5150-4661-a205-5be4b412b7ae\") " Oct 03 10:50:00 crc 
kubenswrapper[4990]: I1003 10:50:00.593363 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities" (OuterVolumeSpecName: "utilities") pod "8c2c72f5-5150-4661-a205-5be4b412b7ae" (UID: "8c2c72f5-5150-4661-a205-5be4b412b7ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.599801 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs" (OuterVolumeSpecName: "kube-api-access-n9vfs") pod "8c2c72f5-5150-4661-a205-5be4b412b7ae" (UID: "8c2c72f5-5150-4661-a205-5be4b412b7ae"). InnerVolumeSpecName "kube-api-access-n9vfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.693634 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9vfs\" (UniqueName: \"kubernetes.io/projected/8c2c72f5-5150-4661-a205-5be4b412b7ae-kube-api-access-n9vfs\") on node \"crc\" DevicePath \"\"" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.693897 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.996668 4990 generic.go:334] "Generic (PLEG): container finished" podID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerID="073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a" exitCode=0 Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.996711 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerDied","Data":"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a"} Oct 03 10:50:00 
crc kubenswrapper[4990]: I1003 10:50:00.996741 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4jrtc" event={"ID":"8c2c72f5-5150-4661-a205-5be4b412b7ae","Type":"ContainerDied","Data":"b98c7c98dde1ac8d2e246fee0135fb173cdfebf68fb8f9aa94d2538d035eef0d"} Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.996759 4990 scope.go:117] "RemoveContainer" containerID="073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a" Oct 03 10:50:00 crc kubenswrapper[4990]: I1003 10:50:00.996800 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4jrtc" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.027095 4990 scope.go:117] "RemoveContainer" containerID="6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.068993 4990 scope.go:117] "RemoveContainer" containerID="7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.084459 4990 scope.go:117] "RemoveContainer" containerID="073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a" Oct 03 10:50:01 crc kubenswrapper[4990]: E1003 10:50:01.084971 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a\": container with ID starting with 073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a not found: ID does not exist" containerID="073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.085028 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a"} err="failed to get container status 
\"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a\": rpc error: code = NotFound desc = could not find container \"073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a\": container with ID starting with 073ee10e8661d6c98bf10a8d0b4a7c54f903e32e4022d5145966d6f5bdfea75a not found: ID does not exist" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.085062 4990 scope.go:117] "RemoveContainer" containerID="6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c" Oct 03 10:50:01 crc kubenswrapper[4990]: E1003 10:50:01.085553 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c\": container with ID starting with 6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c not found: ID does not exist" containerID="6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.085585 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c"} err="failed to get container status \"6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c\": rpc error: code = NotFound desc = could not find container \"6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c\": container with ID starting with 6b71968a4554038e0cb7ba5fe8f87c85410d6d9996ff4155f1cb158251d7878c not found: ID does not exist" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.085607 4990 scope.go:117] "RemoveContainer" containerID="7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610" Oct 03 10:50:01 crc kubenswrapper[4990]: E1003 10:50:01.085974 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610\": container with ID starting with 7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610 not found: ID does not exist" containerID="7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610" Oct 03 10:50:01 crc kubenswrapper[4990]: I1003 10:50:01.086002 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610"} err="failed to get container status \"7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610\": rpc error: code = NotFound desc = could not find container \"7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610\": container with ID starting with 7905826c5f067805ceb34084f83c2b6e2c30533c21d87e6305ce43ab800f7610 not found: ID does not exist" Oct 03 10:50:02 crc kubenswrapper[4990]: I1003 10:50:02.104264 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c2c72f5-5150-4661-a205-5be4b412b7ae" (UID: "8c2c72f5-5150-4661-a205-5be4b412b7ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:50:02 crc kubenswrapper[4990]: I1003 10:50:02.130717 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c2c72f5-5150-4661-a205-5be4b412b7ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:50:02 crc kubenswrapper[4990]: I1003 10:50:02.226098 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:50:02 crc kubenswrapper[4990]: I1003 10:50:02.235612 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4jrtc"] Oct 03 10:50:02 crc kubenswrapper[4990]: I1003 10:50:02.881776 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" path="/var/lib/kubelet/pods/8c2c72f5-5150-4661-a205-5be4b412b7ae/volumes" Oct 03 10:51:25 crc kubenswrapper[4990]: I1003 10:51:25.303672 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:51:25 crc kubenswrapper[4990]: I1003 10:51:25.304242 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:51:55 crc kubenswrapper[4990]: I1003 10:51:55.304461 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 03 10:51:55 crc kubenswrapper[4990]: I1003 10:51:55.305186 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.023780 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:01 crc kubenswrapper[4990]: E1003 10:52:01.024786 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="registry-server" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.024804 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="registry-server" Oct 03 10:52:01 crc kubenswrapper[4990]: E1003 10:52:01.024818 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="extract-content" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.024825 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="extract-content" Oct 03 10:52:01 crc kubenswrapper[4990]: E1003 10:52:01.024887 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="extract-utilities" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.024898 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="extract-utilities" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.025092 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2c72f5-5150-4661-a205-5be4b412b7ae" containerName="registry-server" Oct 03 
10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.026701 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.069644 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.131478 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbd5q\" (UniqueName: \"kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.131890 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.131924 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.233132 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " 
pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.233186 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.233245 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbd5q\" (UniqueName: \"kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.233899 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.233916 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.252563 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbd5q\" (UniqueName: \"kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q\") pod \"certified-operators-gpd68\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " 
pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.361202 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:01 crc kubenswrapper[4990]: I1003 10:52:01.849955 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:02 crc kubenswrapper[4990]: I1003 10:52:02.049419 4990 generic.go:334] "Generic (PLEG): container finished" podID="9004dd59-efc6-4740-8290-aaec19564b0f" containerID="3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e" exitCode=0 Oct 03 10:52:02 crc kubenswrapper[4990]: I1003 10:52:02.049464 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerDied","Data":"3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e"} Oct 03 10:52:02 crc kubenswrapper[4990]: I1003 10:52:02.049490 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerStarted","Data":"18f24127009604ec02659fd8533e7dbb71c3ba8aea5be9b80c19d329b430725e"} Oct 03 10:52:03 crc kubenswrapper[4990]: I1003 10:52:03.067383 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerStarted","Data":"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6"} Oct 03 10:52:04 crc kubenswrapper[4990]: I1003 10:52:04.080996 4990 generic.go:334] "Generic (PLEG): container finished" podID="9004dd59-efc6-4740-8290-aaec19564b0f" containerID="fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6" exitCode=0 Oct 03 10:52:04 crc kubenswrapper[4990]: I1003 10:52:04.081054 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerDied","Data":"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6"} Oct 03 10:52:05 crc kubenswrapper[4990]: I1003 10:52:05.090182 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerStarted","Data":"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f"} Oct 03 10:52:05 crc kubenswrapper[4990]: I1003 10:52:05.111102 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gpd68" podStartSLOduration=1.6397412359999999 podStartE2EDuration="4.111077119s" podCreationTimestamp="2025-10-03 10:52:01 +0000 UTC" firstStartedPulling="2025-10-03 10:52:02.051471194 +0000 UTC m=+4103.848103051" lastFinishedPulling="2025-10-03 10:52:04.522807077 +0000 UTC m=+4106.319438934" observedRunningTime="2025-10-03 10:52:05.108581765 +0000 UTC m=+4106.905213632" watchObservedRunningTime="2025-10-03 10:52:05.111077119 +0000 UTC m=+4106.907708986" Oct 03 10:52:11 crc kubenswrapper[4990]: I1003 10:52:11.362261 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:11 crc kubenswrapper[4990]: I1003 10:52:11.363097 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:11 crc kubenswrapper[4990]: I1003 10:52:11.410292 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:12 crc kubenswrapper[4990]: I1003 10:52:12.187871 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:12 crc kubenswrapper[4990]: I1003 
10:52:12.237866 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:14 crc kubenswrapper[4990]: I1003 10:52:14.163472 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gpd68" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="registry-server" containerID="cri-o://42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f" gracePeriod=2 Oct 03 10:52:15 crc kubenswrapper[4990]: I1003 10:52:15.898400 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.057915 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbd5q\" (UniqueName: \"kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q\") pod \"9004dd59-efc6-4740-8290-aaec19564b0f\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.058021 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content\") pod \"9004dd59-efc6-4740-8290-aaec19564b0f\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.058062 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities\") pod \"9004dd59-efc6-4740-8290-aaec19564b0f\" (UID: \"9004dd59-efc6-4740-8290-aaec19564b0f\") " Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.059042 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities" (OuterVolumeSpecName: 
"utilities") pod "9004dd59-efc6-4740-8290-aaec19564b0f" (UID: "9004dd59-efc6-4740-8290-aaec19564b0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.064612 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q" (OuterVolumeSpecName: "kube-api-access-nbd5q") pod "9004dd59-efc6-4740-8290-aaec19564b0f" (UID: "9004dd59-efc6-4740-8290-aaec19564b0f"). InnerVolumeSpecName "kube-api-access-nbd5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.130649 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9004dd59-efc6-4740-8290-aaec19564b0f" (UID: "9004dd59-efc6-4740-8290-aaec19564b0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.159863 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.159914 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9004dd59-efc6-4740-8290-aaec19564b0f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.159928 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbd5q\" (UniqueName: \"kubernetes.io/projected/9004dd59-efc6-4740-8290-aaec19564b0f-kube-api-access-nbd5q\") on node \"crc\" DevicePath \"\"" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.181489 4990 generic.go:334] "Generic (PLEG): container finished" podID="9004dd59-efc6-4740-8290-aaec19564b0f" containerID="42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f" exitCode=0 Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.181588 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerDied","Data":"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f"} Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.181618 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gpd68" event={"ID":"9004dd59-efc6-4740-8290-aaec19564b0f","Type":"ContainerDied","Data":"18f24127009604ec02659fd8533e7dbb71c3ba8aea5be9b80c19d329b430725e"} Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.181641 4990 scope.go:117] "RemoveContainer" containerID="42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 
10:52:16.181776 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gpd68" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.217301 4990 scope.go:117] "RemoveContainer" containerID="fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.221624 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.227183 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gpd68"] Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.249759 4990 scope.go:117] "RemoveContainer" containerID="3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.265880 4990 scope.go:117] "RemoveContainer" containerID="42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f" Oct 03 10:52:16 crc kubenswrapper[4990]: E1003 10:52:16.266317 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f\": container with ID starting with 42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f not found: ID does not exist" containerID="42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.266354 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f"} err="failed to get container status \"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f\": rpc error: code = NotFound desc = could not find container \"42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f\": container with ID starting with 
42528ba12412055a7f57c2da393931f679735a16c898d2bab2cf78bffb9f138f not found: ID does not exist" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.266379 4990 scope.go:117] "RemoveContainer" containerID="fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6" Oct 03 10:52:16 crc kubenswrapper[4990]: E1003 10:52:16.266613 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6\": container with ID starting with fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6 not found: ID does not exist" containerID="fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.266641 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6"} err="failed to get container status \"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6\": rpc error: code = NotFound desc = could not find container \"fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6\": container with ID starting with fb5123af24ac3e2eda5330f26e6a9bb04604dc457a21f65f1b56fc265d2bc5f6 not found: ID does not exist" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.266663 4990 scope.go:117] "RemoveContainer" containerID="3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e" Oct 03 10:52:16 crc kubenswrapper[4990]: E1003 10:52:16.266978 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e\": container with ID starting with 3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e not found: ID does not exist" containerID="3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e" Oct 03 10:52:16 crc 
kubenswrapper[4990]: I1003 10:52:16.267009 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e"} err="failed to get container status \"3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e\": rpc error: code = NotFound desc = could not find container \"3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e\": container with ID starting with 3e165a5d85a7892695109fb0660e23688fed3f60abfc280eb78e56dd3492558e not found: ID does not exist" Oct 03 10:52:16 crc kubenswrapper[4990]: I1003 10:52:16.886819 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" path="/var/lib/kubelet/pods/9004dd59-efc6-4740-8290-aaec19564b0f/volumes" Oct 03 10:52:25 crc kubenswrapper[4990]: I1003 10:52:25.303839 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:52:25 crc kubenswrapper[4990]: I1003 10:52:25.304401 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:52:25 crc kubenswrapper[4990]: I1003 10:52:25.304459 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:52:25 crc kubenswrapper[4990]: I1003 10:52:25.305228 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:52:25 crc kubenswrapper[4990]: I1003 10:52:25.305293 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a" gracePeriod=600 Oct 03 10:52:26 crc kubenswrapper[4990]: I1003 10:52:26.287694 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a" exitCode=0 Oct 03 10:52:26 crc kubenswrapper[4990]: I1003 10:52:26.287893 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a"} Oct 03 10:52:26 crc kubenswrapper[4990]: I1003 10:52:26.288313 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62"} Oct 03 10:52:26 crc kubenswrapper[4990]: I1003 10:52:26.288342 4990 scope.go:117] "RemoveContainer" containerID="2c1b1ccb31bde75fa2ce82e72aef521357ece5c9bc2fab8fca5d0cb77adcb4c4" Oct 03 10:54:25 crc kubenswrapper[4990]: I1003 10:54:25.303596 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:54:25 crc kubenswrapper[4990]: I1003 10:54:25.304430 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:54:55 crc kubenswrapper[4990]: I1003 10:54:55.303695 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:54:55 crc kubenswrapper[4990]: I1003 10:54:55.304554 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.462482 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:07 crc kubenswrapper[4990]: E1003 10:55:07.463289 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="extract-utilities" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.463303 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="extract-utilities" Oct 03 10:55:07 crc kubenswrapper[4990]: E1003 10:55:07.463322 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="registry-server" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.463330 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="registry-server" Oct 03 10:55:07 crc kubenswrapper[4990]: E1003 10:55:07.463355 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="extract-content" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.463363 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="extract-content" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.463571 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9004dd59-efc6-4740-8290-aaec19564b0f" containerName="registry-server" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.466142 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.515004 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.640733 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.640947 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") 
" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.641038 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5jg\" (UniqueName: \"kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.742476 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.742590 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.742630 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5jg\" (UniqueName: \"kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.742985 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " 
pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.743168 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.765808 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5jg\" (UniqueName: \"kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg\") pod \"community-operators-gggrc\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:07 crc kubenswrapper[4990]: I1003 10:55:07.810583 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:08 crc kubenswrapper[4990]: I1003 10:55:08.334007 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:08 crc kubenswrapper[4990]: I1003 10:55:08.622078 4990 generic.go:334] "Generic (PLEG): container finished" podID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerID="34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d" exitCode=0 Oct 03 10:55:08 crc kubenswrapper[4990]: I1003 10:55:08.622123 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerDied","Data":"34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d"} Oct 03 10:55:08 crc kubenswrapper[4990]: I1003 10:55:08.622153 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" 
event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerStarted","Data":"b6e7559eef19fc09b464ceeb64ef010615ab2e0e6c0386f4ac87440fbb7cd0ee"} Oct 03 10:55:08 crc kubenswrapper[4990]: I1003 10:55:08.624107 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:55:09 crc kubenswrapper[4990]: I1003 10:55:09.631830 4990 generic.go:334] "Generic (PLEG): container finished" podID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerID="65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5" exitCode=0 Oct 03 10:55:09 crc kubenswrapper[4990]: I1003 10:55:09.631892 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerDied","Data":"65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5"} Oct 03 10:55:10 crc kubenswrapper[4990]: I1003 10:55:10.668940 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerStarted","Data":"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce"} Oct 03 10:55:10 crc kubenswrapper[4990]: I1003 10:55:10.698969 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gggrc" podStartSLOduration=2.024815077 podStartE2EDuration="3.69894841s" podCreationTimestamp="2025-10-03 10:55:07 +0000 UTC" firstStartedPulling="2025-10-03 10:55:08.623853321 +0000 UTC m=+4290.420485168" lastFinishedPulling="2025-10-03 10:55:10.297986614 +0000 UTC m=+4292.094618501" observedRunningTime="2025-10-03 10:55:10.693871008 +0000 UTC m=+4292.490502865" watchObservedRunningTime="2025-10-03 10:55:10.69894841 +0000 UTC m=+4292.495580287" Oct 03 10:55:17 crc kubenswrapper[4990]: I1003 10:55:17.811500 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:17 crc kubenswrapper[4990]: I1003 10:55:17.812420 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:17 crc kubenswrapper[4990]: I1003 10:55:17.873685 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:18 crc kubenswrapper[4990]: I1003 10:55:18.823983 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:18 crc kubenswrapper[4990]: I1003 10:55:18.887896 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:20 crc kubenswrapper[4990]: I1003 10:55:20.769189 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gggrc" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="registry-server" containerID="cri-o://a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce" gracePeriod=2 Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.214408 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.348565 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities\") pod \"06b59e0d-36b6-42a0-af4d-7ad68a743175\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.348670 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5jg\" (UniqueName: \"kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg\") pod \"06b59e0d-36b6-42a0-af4d-7ad68a743175\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.348755 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content\") pod \"06b59e0d-36b6-42a0-af4d-7ad68a743175\" (UID: \"06b59e0d-36b6-42a0-af4d-7ad68a743175\") " Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.349395 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities" (OuterVolumeSpecName: "utilities") pod "06b59e0d-36b6-42a0-af4d-7ad68a743175" (UID: "06b59e0d-36b6-42a0-af4d-7ad68a743175"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.357731 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg" (OuterVolumeSpecName: "kube-api-access-pw5jg") pod "06b59e0d-36b6-42a0-af4d-7ad68a743175" (UID: "06b59e0d-36b6-42a0-af4d-7ad68a743175"). InnerVolumeSpecName "kube-api-access-pw5jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.398820 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06b59e0d-36b6-42a0-af4d-7ad68a743175" (UID: "06b59e0d-36b6-42a0-af4d-7ad68a743175"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.450921 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.450971 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5jg\" (UniqueName: \"kubernetes.io/projected/06b59e0d-36b6-42a0-af4d-7ad68a743175-kube-api-access-pw5jg\") on node \"crc\" DevicePath \"\"" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.450993 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b59e0d-36b6-42a0-af4d-7ad68a743175-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.779497 4990 generic.go:334] "Generic (PLEG): container finished" podID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerID="a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce" exitCode=0 Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.779575 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerDied","Data":"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce"} Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.779586 4990 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-gggrc" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.779613 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gggrc" event={"ID":"06b59e0d-36b6-42a0-af4d-7ad68a743175","Type":"ContainerDied","Data":"b6e7559eef19fc09b464ceeb64ef010615ab2e0e6c0386f4ac87440fbb7cd0ee"} Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.779644 4990 scope.go:117] "RemoveContainer" containerID="a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.823398 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.830360 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gggrc"] Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.832077 4990 scope.go:117] "RemoveContainer" containerID="65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.857988 4990 scope.go:117] "RemoveContainer" containerID="34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.881879 4990 scope.go:117] "RemoveContainer" containerID="a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce" Oct 03 10:55:21 crc kubenswrapper[4990]: E1003 10:55:21.882943 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce\": container with ID starting with a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce not found: ID does not exist" containerID="a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.882991 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce"} err="failed to get container status \"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce\": rpc error: code = NotFound desc = could not find container \"a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce\": container with ID starting with a5baffd32093b824676414f812ca31250220aac46729c1d1388eb26fbb2782ce not found: ID does not exist" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.883017 4990 scope.go:117] "RemoveContainer" containerID="65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5" Oct 03 10:55:21 crc kubenswrapper[4990]: E1003 10:55:21.883811 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5\": container with ID starting with 65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5 not found: ID does not exist" containerID="65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.883856 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5"} err="failed to get container status \"65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5\": rpc error: code = NotFound desc = could not find container \"65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5\": container with ID starting with 65a19baacdc62a5108d7ea086312bba65ed7968906bd78f94f0f6157e4d654f5 not found: ID does not exist" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.883892 4990 scope.go:117] "RemoveContainer" containerID="34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d" Oct 03 10:55:21 crc kubenswrapper[4990]: E1003 
10:55:21.884318 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d\": container with ID starting with 34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d not found: ID does not exist" containerID="34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d" Oct 03 10:55:21 crc kubenswrapper[4990]: I1003 10:55:21.884356 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d"} err="failed to get container status \"34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d\": rpc error: code = NotFound desc = could not find container \"34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d\": container with ID starting with 34f0d292c189daeeb871b9be9b39fccc629c523b04d91ce90e7d564c4ed78c3d not found: ID does not exist" Oct 03 10:55:22 crc kubenswrapper[4990]: I1003 10:55:22.883844 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" path="/var/lib/kubelet/pods/06b59e0d-36b6-42a0-af4d-7ad68a743175/volumes" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.304161 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.304669 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.304746 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.305806 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.305915 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" gracePeriod=600 Oct 03 10:55:25 crc kubenswrapper[4990]: E1003 10:55:25.459484 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.822505 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" exitCode=0 Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.822628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62"} Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.822751 4990 scope.go:117] "RemoveContainer" containerID="8a6ec41b995a2eb5e892eac05f8dddb37bc968bc5110dd7e700194d6288bc63a" Oct 03 10:55:25 crc kubenswrapper[4990]: I1003 10:55:25.823449 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:55:25 crc kubenswrapper[4990]: E1003 10:55:25.824034 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:55:38 crc kubenswrapper[4990]: I1003 10:55:38.882896 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:55:38 crc kubenswrapper[4990]: E1003 10:55:38.884135 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:55:54 crc kubenswrapper[4990]: I1003 10:55:54.872601 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:55:54 crc kubenswrapper[4990]: E1003 10:55:54.873419 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:56:05 crc kubenswrapper[4990]: I1003 10:56:05.872015 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:56:05 crc kubenswrapper[4990]: E1003 10:56:05.873098 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.132479 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:12 crc kubenswrapper[4990]: E1003 10:56:12.133543 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="extract-utilities" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.133565 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="extract-utilities" Oct 03 10:56:12 crc kubenswrapper[4990]: E1003 10:56:12.133595 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="extract-content" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.133606 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="extract-content" 
Oct 03 10:56:12 crc kubenswrapper[4990]: E1003 10:56:12.133635 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="registry-server" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.133645 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="registry-server" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.133871 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b59e0d-36b6-42a0-af4d-7ad68a743175" containerName="registry-server" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.136008 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.147754 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.257656 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.257704 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4kn5\" (UniqueName: \"kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.257723 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.359557 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.359608 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4kn5\" (UniqueName: \"kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.359629 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.360123 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.360233 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.407347 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4kn5\" (UniqueName: \"kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5\") pod \"redhat-marketplace-bbjh2\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.466571 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:12 crc kubenswrapper[4990]: I1003 10:56:12.933300 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:13 crc kubenswrapper[4990]: I1003 10:56:13.217256 4990 generic.go:334] "Generic (PLEG): container finished" podID="6dc09966-242b-45ea-91fb-5465360591ed" containerID="3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0" exitCode=0 Oct 03 10:56:13 crc kubenswrapper[4990]: I1003 10:56:13.217367 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerDied","Data":"3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0"} Oct 03 10:56:13 crc kubenswrapper[4990]: I1003 10:56:13.217618 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerStarted","Data":"a7737604f16767b51a57b3782d236387284e098030d70c7d619811768e6480a9"} Oct 03 10:56:14 crc kubenswrapper[4990]: I1003 10:56:14.227497 4990 generic.go:334] "Generic (PLEG): container 
finished" podID="6dc09966-242b-45ea-91fb-5465360591ed" containerID="678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924" exitCode=0 Oct 03 10:56:14 crc kubenswrapper[4990]: I1003 10:56:14.227558 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerDied","Data":"678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924"} Oct 03 10:56:15 crc kubenswrapper[4990]: I1003 10:56:15.238360 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerStarted","Data":"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e"} Oct 03 10:56:15 crc kubenswrapper[4990]: I1003 10:56:15.262327 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bbjh2" podStartSLOduration=1.605026251 podStartE2EDuration="3.262297809s" podCreationTimestamp="2025-10-03 10:56:12 +0000 UTC" firstStartedPulling="2025-10-03 10:56:13.219117158 +0000 UTC m=+4355.015749025" lastFinishedPulling="2025-10-03 10:56:14.876388716 +0000 UTC m=+4356.673020583" observedRunningTime="2025-10-03 10:56:15.254400315 +0000 UTC m=+4357.051032172" watchObservedRunningTime="2025-10-03 10:56:15.262297809 +0000 UTC m=+4357.058929696" Oct 03 10:56:17 crc kubenswrapper[4990]: I1003 10:56:17.871653 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:56:17 crc kubenswrapper[4990]: E1003 10:56:17.872275 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:56:22 crc kubenswrapper[4990]: I1003 10:56:22.467622 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:22 crc kubenswrapper[4990]: I1003 10:56:22.468062 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:22 crc kubenswrapper[4990]: I1003 10:56:22.515067 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:23 crc kubenswrapper[4990]: I1003 10:56:23.384305 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:23 crc kubenswrapper[4990]: I1003 10:56:23.454874 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.327651 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bbjh2" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="registry-server" containerID="cri-o://d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e" gracePeriod=2 Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.767426 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.862930 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities\") pod \"6dc09966-242b-45ea-91fb-5465360591ed\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.863019 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4kn5\" (UniqueName: \"kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5\") pod \"6dc09966-242b-45ea-91fb-5465360591ed\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.863101 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content\") pod \"6dc09966-242b-45ea-91fb-5465360591ed\" (UID: \"6dc09966-242b-45ea-91fb-5465360591ed\") " Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.863845 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities" (OuterVolumeSpecName: "utilities") pod "6dc09966-242b-45ea-91fb-5465360591ed" (UID: "6dc09966-242b-45ea-91fb-5465360591ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.875299 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5" (OuterVolumeSpecName: "kube-api-access-f4kn5") pod "6dc09966-242b-45ea-91fb-5465360591ed" (UID: "6dc09966-242b-45ea-91fb-5465360591ed"). InnerVolumeSpecName "kube-api-access-f4kn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.882625 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dc09966-242b-45ea-91fb-5465360591ed" (UID: "6dc09966-242b-45ea-91fb-5465360591ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.966049 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.966370 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4kn5\" (UniqueName: \"kubernetes.io/projected/6dc09966-242b-45ea-91fb-5465360591ed-kube-api-access-f4kn5\") on node \"crc\" DevicePath \"\"" Oct 03 10:56:25 crc kubenswrapper[4990]: I1003 10:56:25.966471 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dc09966-242b-45ea-91fb-5465360591ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.341360 4990 generic.go:334] "Generic (PLEG): container finished" podID="6dc09966-242b-45ea-91fb-5465360591ed" containerID="d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e" exitCode=0 Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.341421 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerDied","Data":"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e"} Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.341469 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bbjh2" event={"ID":"6dc09966-242b-45ea-91fb-5465360591ed","Type":"ContainerDied","Data":"a7737604f16767b51a57b3782d236387284e098030d70c7d619811768e6480a9"} Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.341492 4990 scope.go:117] "RemoveContainer" containerID="d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.341484 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bbjh2" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.375784 4990 scope.go:117] "RemoveContainer" containerID="678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.395177 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.410015 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bbjh2"] Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.434311 4990 scope.go:117] "RemoveContainer" containerID="3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.457726 4990 scope.go:117] "RemoveContainer" containerID="d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e" Oct 03 10:56:26 crc kubenswrapper[4990]: E1003 10:56:26.458280 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e\": container with ID starting with d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e not found: ID does not exist" containerID="d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.458342 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e"} err="failed to get container status \"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e\": rpc error: code = NotFound desc = could not find container \"d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e\": container with ID starting with d2c280b98315e2abed9cc53e3fd007d9d79381aeff883bf1cf3f50b35d6dd64e not found: ID does not exist" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.458384 4990 scope.go:117] "RemoveContainer" containerID="678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924" Oct 03 10:56:26 crc kubenswrapper[4990]: E1003 10:56:26.458887 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924\": container with ID starting with 678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924 not found: ID does not exist" containerID="678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.459128 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924"} err="failed to get container status \"678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924\": rpc error: code = NotFound desc = could not find container \"678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924\": container with ID starting with 678e5a5ca43347116901a48b13abf76df176670a899e366e584776c87c567924 not found: ID does not exist" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.459392 4990 scope.go:117] "RemoveContainer" containerID="3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0" Oct 03 10:56:26 crc kubenswrapper[4990]: E1003 
10:56:26.460091 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0\": container with ID starting with 3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0 not found: ID does not exist" containerID="3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.460128 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0"} err="failed to get container status \"3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0\": rpc error: code = NotFound desc = could not find container \"3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0\": container with ID starting with 3508344175d5ebf156af8b1c702bcdd1696c6a71545a2eb071c684be16511ba0 not found: ID does not exist" Oct 03 10:56:26 crc kubenswrapper[4990]: I1003 10:56:26.880646 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc09966-242b-45ea-91fb-5465360591ed" path="/var/lib/kubelet/pods/6dc09966-242b-45ea-91fb-5465360591ed/volumes" Oct 03 10:56:28 crc kubenswrapper[4990]: I1003 10:56:28.875398 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:56:28 crc kubenswrapper[4990]: E1003 10:56:28.875897 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:56:41 crc kubenswrapper[4990]: I1003 10:56:41.871753 
4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:56:41 crc kubenswrapper[4990]: E1003 10:56:41.872718 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:56:52 crc kubenswrapper[4990]: I1003 10:56:52.872844 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:56:52 crc kubenswrapper[4990]: E1003 10:56:52.874079 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:57:07 crc kubenswrapper[4990]: I1003 10:57:07.872324 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:57:07 crc kubenswrapper[4990]: E1003 10:57:07.873609 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:57:22 crc kubenswrapper[4990]: I1003 
10:57:22.871644 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:57:22 crc kubenswrapper[4990]: E1003 10:57:22.872631 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:57:37 crc kubenswrapper[4990]: I1003 10:57:37.871993 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:57:37 crc kubenswrapper[4990]: E1003 10:57:37.872557 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:57:51 crc kubenswrapper[4990]: I1003 10:57:51.871455 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:57:51 crc kubenswrapper[4990]: E1003 10:57:51.872202 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:58:02 crc 
kubenswrapper[4990]: I1003 10:58:02.871987 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:58:02 crc kubenswrapper[4990]: E1003 10:58:02.872994 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:58:14 crc kubenswrapper[4990]: I1003 10:58:14.872564 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:58:14 crc kubenswrapper[4990]: E1003 10:58:14.874355 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:58:29 crc kubenswrapper[4990]: I1003 10:58:29.872484 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:58:29 crc kubenswrapper[4990]: E1003 10:58:29.873745 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 
03 10:58:43 crc kubenswrapper[4990]: I1003 10:58:43.872299 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:58:43 crc kubenswrapper[4990]: E1003 10:58:43.873241 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:58:54 crc kubenswrapper[4990]: I1003 10:58:54.872153 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:58:54 crc kubenswrapper[4990]: E1003 10:58:54.873172 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:59:05 crc kubenswrapper[4990]: I1003 10:59:05.871384 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:59:05 crc kubenswrapper[4990]: E1003 10:59:05.872363 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:59:18 crc kubenswrapper[4990]: I1003 10:59:18.878948 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:59:18 crc kubenswrapper[4990]: E1003 10:59:18.879993 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:59:30 crc kubenswrapper[4990]: I1003 10:59:30.871739 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:59:30 crc kubenswrapper[4990]: E1003 10:59:30.872629 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:59:42 crc kubenswrapper[4990]: I1003 10:59:42.872928 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:59:42 crc kubenswrapper[4990]: E1003 10:59:42.874031 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 10:59:57 crc kubenswrapper[4990]: I1003 10:59:57.872008 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 10:59:57 crc kubenswrapper[4990]: E1003 10:59:57.873016 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.151696 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5"] Oct 03 11:00:00 crc kubenswrapper[4990]: E1003 11:00:00.152541 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="extract-content" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.152555 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="extract-content" Oct 03 11:00:00 crc kubenswrapper[4990]: E1003 11:00:00.152581 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="extract-utilities" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.152590 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="extract-utilities" Oct 03 11:00:00 crc kubenswrapper[4990]: E1003 11:00:00.152601 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="registry-server" Oct 03 11:00:00 crc 
kubenswrapper[4990]: I1003 11:00:00.152606 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="registry-server" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.152743 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc09966-242b-45ea-91fb-5465360591ed" containerName="registry-server" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.153272 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.156667 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.160102 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5"] Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.160923 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.270388 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6kl8\" (UniqueName: \"kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.270462 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.270549 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.372583 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.372826 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6kl8\" (UniqueName: \"kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.372889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.375113 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.381409 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.403097 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6kl8\" (UniqueName: \"kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8\") pod \"collect-profiles-29324820-gffx5\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.475641 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:00 crc kubenswrapper[4990]: I1003 11:00:00.868076 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5"] Oct 03 11:00:01 crc kubenswrapper[4990]: I1003 11:00:01.289647 4990 generic.go:334] "Generic (PLEG): container finished" podID="86f70e51-844b-4117-84d9-ea9b8b10b65a" containerID="b14ed541fe3d91784c6d616c02bcb786419793b57c7553c87ce6de5b5068282c" exitCode=0 Oct 03 11:00:01 crc kubenswrapper[4990]: I1003 11:00:01.289696 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" event={"ID":"86f70e51-844b-4117-84d9-ea9b8b10b65a","Type":"ContainerDied","Data":"b14ed541fe3d91784c6d616c02bcb786419793b57c7553c87ce6de5b5068282c"} Oct 03 11:00:01 crc kubenswrapper[4990]: I1003 11:00:01.289942 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" event={"ID":"86f70e51-844b-4117-84d9-ea9b8b10b65a","Type":"ContainerStarted","Data":"fbdee17918c3d42b8199b8f6c21a2f8c3cbc7e83d597d50f78c5baedf28bfda3"} Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.618580 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.808470 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume\") pod \"86f70e51-844b-4117-84d9-ea9b8b10b65a\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.808666 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume\") pod \"86f70e51-844b-4117-84d9-ea9b8b10b65a\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.809190 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume" (OuterVolumeSpecName: "config-volume") pod "86f70e51-844b-4117-84d9-ea9b8b10b65a" (UID: "86f70e51-844b-4117-84d9-ea9b8b10b65a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.809258 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6kl8\" (UniqueName: \"kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8\") pod \"86f70e51-844b-4117-84d9-ea9b8b10b65a\" (UID: \"86f70e51-844b-4117-84d9-ea9b8b10b65a\") " Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.809829 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f70e51-844b-4117-84d9-ea9b8b10b65a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.813324 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "86f70e51-844b-4117-84d9-ea9b8b10b65a" (UID: "86f70e51-844b-4117-84d9-ea9b8b10b65a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.813415 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8" (OuterVolumeSpecName: "kube-api-access-x6kl8") pod "86f70e51-844b-4117-84d9-ea9b8b10b65a" (UID: "86f70e51-844b-4117-84d9-ea9b8b10b65a"). InnerVolumeSpecName "kube-api-access-x6kl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.911534 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86f70e51-844b-4117-84d9-ea9b8b10b65a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:00:02 crc kubenswrapper[4990]: I1003 11:00:02.911559 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6kl8\" (UniqueName: \"kubernetes.io/projected/86f70e51-844b-4117-84d9-ea9b8b10b65a-kube-api-access-x6kl8\") on node \"crc\" DevicePath \"\"" Oct 03 11:00:03 crc kubenswrapper[4990]: I1003 11:00:03.309468 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" event={"ID":"86f70e51-844b-4117-84d9-ea9b8b10b65a","Type":"ContainerDied","Data":"fbdee17918c3d42b8199b8f6c21a2f8c3cbc7e83d597d50f78c5baedf28bfda3"} Oct 03 11:00:03 crc kubenswrapper[4990]: I1003 11:00:03.309503 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbdee17918c3d42b8199b8f6c21a2f8c3cbc7e83d597d50f78c5baedf28bfda3" Oct 03 11:00:03 crc kubenswrapper[4990]: I1003 11:00:03.309575 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5" Oct 03 11:00:03 crc kubenswrapper[4990]: I1003 11:00:03.718983 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d"] Oct 03 11:00:03 crc kubenswrapper[4990]: I1003 11:00:03.730879 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-95p6d"] Oct 03 11:00:04 crc kubenswrapper[4990]: I1003 11:00:04.883179 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b19cc0d-3846-4242-8991-fbe787cc9680" path="/var/lib/kubelet/pods/8b19cc0d-3846-4242-8991-fbe787cc9680/volumes" Oct 03 11:00:10 crc kubenswrapper[4990]: I1003 11:00:10.872227 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 11:00:10 crc kubenswrapper[4990]: E1003 11:00:10.873180 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:00:22 crc kubenswrapper[4990]: I1003 11:00:22.872853 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 11:00:22 crc kubenswrapper[4990]: E1003 11:00:22.873843 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:00:33 crc kubenswrapper[4990]: I1003 11:00:33.872078 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 11:00:34 crc kubenswrapper[4990]: I1003 11:00:34.577309 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1"} Oct 03 11:00:47 crc kubenswrapper[4990]: I1003 11:00:47.311356 4990 scope.go:117] "RemoveContainer" containerID="1d0967b9b6e7aaf644eba392f856bc81d0cb03bfb5c16e1b642676daaec05290" Oct 03 11:02:55 crc kubenswrapper[4990]: I1003 11:02:55.304305 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:02:55 crc kubenswrapper[4990]: I1003 11:02:55.305005 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:03:25 crc kubenswrapper[4990]: I1003 11:03:25.303880 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:03:25 crc kubenswrapper[4990]: I1003 11:03:25.306028 4990 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.073084 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-hbzxs"] Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.084223 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-hbzxs"] Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.244203 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p2ncs"] Oct 03 11:03:46 crc kubenswrapper[4990]: E1003 11:03:46.244573 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f70e51-844b-4117-84d9-ea9b8b10b65a" containerName="collect-profiles" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.244595 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f70e51-844b-4117-84d9-ea9b8b10b65a" containerName="collect-profiles" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.244793 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f70e51-844b-4117-84d9-ea9b8b10b65a" containerName="collect-profiles" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.245370 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.248485 4990 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pfc72" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.248562 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.249767 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2ncs"] Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.250092 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.250891 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.324324 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps292\" (UniqueName: \"kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.324676 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.324729 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage\") pod \"crc-storage-crc-p2ncs\" (UID: 
\"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.425368 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps292\" (UniqueName: \"kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.425419 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.425460 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.425892 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.426198 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.870072 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps292\" (UniqueName: \"kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292\") pod \"crc-storage-crc-p2ncs\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:46 crc kubenswrapper[4990]: I1003 11:03:46.889406 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e729a46-4a98-46a6-bc49-d6f73014b3b5" path="/var/lib/kubelet/pods/7e729a46-4a98-46a6-bc49-d6f73014b3b5/volumes" Oct 03 11:03:47 crc kubenswrapper[4990]: I1003 11:03:47.163213 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:47 crc kubenswrapper[4990]: I1003 11:03:47.416468 4990 scope.go:117] "RemoveContainer" containerID="0526b31e28e3b59b231194eb871d1d2d5943d70d62c4984c4c718c80c4aa3110" Oct 03 11:03:47 crc kubenswrapper[4990]: I1003 11:03:47.576858 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2ncs"] Oct 03 11:03:47 crc kubenswrapper[4990]: I1003 11:03:47.587190 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:03:48 crc kubenswrapper[4990]: I1003 11:03:48.368822 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2ncs" event={"ID":"073843b0-7fe8-4552-9002-713520fd507b","Type":"ContainerStarted","Data":"77390e486cd11435c49886bcbdb10706a8d04cc11b67d5c425785e23b06ede9e"} Oct 03 11:03:48 crc kubenswrapper[4990]: I1003 11:03:48.369178 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2ncs" event={"ID":"073843b0-7fe8-4552-9002-713520fd507b","Type":"ContainerStarted","Data":"e9ad52f212740dc4b226c0afc247fc35c097c23b83b93493088eabe79ef93f3d"} Oct 03 11:03:48 crc kubenswrapper[4990]: I1003 11:03:48.387002 4990 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="crc-storage/crc-storage-crc-p2ncs" podStartSLOduration=1.843063348 podStartE2EDuration="2.386980773s" podCreationTimestamp="2025-10-03 11:03:46 +0000 UTC" firstStartedPulling="2025-10-03 11:03:47.586852717 +0000 UTC m=+4809.383484584" lastFinishedPulling="2025-10-03 11:03:48.130770132 +0000 UTC m=+4809.927402009" observedRunningTime="2025-10-03 11:03:48.386130391 +0000 UTC m=+4810.182762248" watchObservedRunningTime="2025-10-03 11:03:48.386980773 +0000 UTC m=+4810.183612630" Oct 03 11:03:49 crc kubenswrapper[4990]: I1003 11:03:49.379464 4990 generic.go:334] "Generic (PLEG): container finished" podID="073843b0-7fe8-4552-9002-713520fd507b" containerID="77390e486cd11435c49886bcbdb10706a8d04cc11b67d5c425785e23b06ede9e" exitCode=0 Oct 03 11:03:49 crc kubenswrapper[4990]: I1003 11:03:49.379561 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2ncs" event={"ID":"073843b0-7fe8-4552-9002-713520fd507b","Type":"ContainerDied","Data":"77390e486cd11435c49886bcbdb10706a8d04cc11b67d5c425785e23b06ede9e"} Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.807061 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.889986 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt\") pod \"073843b0-7fe8-4552-9002-713520fd507b\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.890096 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps292\" (UniqueName: \"kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292\") pod \"073843b0-7fe8-4552-9002-713520fd507b\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.890121 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "073843b0-7fe8-4552-9002-713520fd507b" (UID: "073843b0-7fe8-4552-9002-713520fd507b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.890176 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage\") pod \"073843b0-7fe8-4552-9002-713520fd507b\" (UID: \"073843b0-7fe8-4552-9002-713520fd507b\") " Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.890500 4990 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/073843b0-7fe8-4552-9002-713520fd507b-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.898641 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292" (OuterVolumeSpecName: "kube-api-access-ps292") pod "073843b0-7fe8-4552-9002-713520fd507b" (UID: "073843b0-7fe8-4552-9002-713520fd507b"). InnerVolumeSpecName "kube-api-access-ps292". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.909748 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "073843b0-7fe8-4552-9002-713520fd507b" (UID: "073843b0-7fe8-4552-9002-713520fd507b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.991264 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps292\" (UniqueName: \"kubernetes.io/projected/073843b0-7fe8-4552-9002-713520fd507b-kube-api-access-ps292\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:50 crc kubenswrapper[4990]: I1003 11:03:50.991289 4990 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/073843b0-7fe8-4552-9002-713520fd507b-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:51 crc kubenswrapper[4990]: I1003 11:03:51.400323 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2ncs" event={"ID":"073843b0-7fe8-4552-9002-713520fd507b","Type":"ContainerDied","Data":"e9ad52f212740dc4b226c0afc247fc35c097c23b83b93493088eabe79ef93f3d"} Oct 03 11:03:51 crc kubenswrapper[4990]: I1003 11:03:51.400698 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9ad52f212740dc4b226c0afc247fc35c097c23b83b93493088eabe79ef93f3d" Oct 03 11:03:51 crc kubenswrapper[4990]: I1003 11:03:51.400840 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2ncs" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.158989 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p2ncs"] Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.165632 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p2ncs"] Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.337649 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-642x2"] Oct 03 11:03:53 crc kubenswrapper[4990]: E1003 11:03:53.338073 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073843b0-7fe8-4552-9002-713520fd507b" containerName="storage" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.338103 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="073843b0-7fe8-4552-9002-713520fd507b" containerName="storage" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.338363 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="073843b0-7fe8-4552-9002-713520fd507b" containerName="storage" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.339099 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.343749 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.344567 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.345269 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.345581 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-642x2"] Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.345929 4990 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pfc72" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.451005 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvqz\" (UniqueName: \"kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.451044 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.451066 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt\") pod \"crc-storage-crc-642x2\" (UID: 
\"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.552637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvqz\" (UniqueName: \"kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.552689 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.552714 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.553117 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.554235 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.572457 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvqz\" (UniqueName: \"kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz\") pod \"crc-storage-crc-642x2\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.663537 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:53 crc kubenswrapper[4990]: I1003 11:03:53.985928 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-642x2"] Oct 03 11:03:54 crc kubenswrapper[4990]: I1003 11:03:54.465322 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-642x2" event={"ID":"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616","Type":"ContainerStarted","Data":"dab12aa1557fa7446161ece8a41f247f24d5d556f3688ba39a78db97c98eb6f0"} Oct 03 11:03:54 crc kubenswrapper[4990]: I1003 11:03:54.881470 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073843b0-7fe8-4552-9002-713520fd507b" path="/var/lib/kubelet/pods/073843b0-7fe8-4552-9002-713520fd507b/volumes" Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.303849 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.304256 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:03:55 crc 
kubenswrapper[4990]: I1003 11:03:55.304315 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.305093 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.305186 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1" gracePeriod=600 Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.474645 4990 generic.go:334] "Generic (PLEG): container finished" podID="4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" containerID="ee3371cf5ddb09c1048c7bf65f1e67c84a3fa5b235c78415763bcaed05d76b1c" exitCode=0 Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.475211 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-642x2" event={"ID":"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616","Type":"ContainerDied","Data":"ee3371cf5ddb09c1048c7bf65f1e67c84a3fa5b235c78415763bcaed05d76b1c"} Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.478070 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1" exitCode=0 Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.478150 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1"} Oct 03 11:03:55 crc kubenswrapper[4990]: I1003 11:03:55.478260 4990 scope.go:117] "RemoveContainer" containerID="94892a0fda6a2e3c95b3229e784748d5869d545a09fe0a6cbd5e5fc537e99c62" Oct 03 11:03:56 crc kubenswrapper[4990]: I1003 11:03:56.490851 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"} Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.239267 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.306779 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvqz\" (UniqueName: \"kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz\") pod \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.306824 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage\") pod \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.306869 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt\") pod \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\" (UID: \"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616\") " Oct 03 11:03:57 crc 
kubenswrapper[4990]: I1003 11:03:57.307074 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" (UID: "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.307188 4990 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.325987 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz" (OuterVolumeSpecName: "kube-api-access-4pvqz") pod "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" (UID: "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616"). InnerVolumeSpecName "kube-api-access-4pvqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.326825 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" (UID: "4d5ad9f5-34af-4b7d-92d1-eaab32e2a616"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.408277 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pvqz\" (UniqueName: \"kubernetes.io/projected/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-kube-api-access-4pvqz\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.408336 4990 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4d5ad9f5-34af-4b7d-92d1-eaab32e2a616-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.499385 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-642x2" event={"ID":"4d5ad9f5-34af-4b7d-92d1-eaab32e2a616","Type":"ContainerDied","Data":"dab12aa1557fa7446161ece8a41f247f24d5d556f3688ba39a78db97c98eb6f0"} Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.499451 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab12aa1557fa7446161ece8a41f247f24d5d556f3688ba39a78db97c98eb6f0" Oct 03 11:03:57 crc kubenswrapper[4990]: I1003 11:03:57.499400 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-642x2" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.157029 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:14 crc kubenswrapper[4990]: E1003 11:05:14.159371 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" containerName="storage" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.159462 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" containerName="storage" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.159695 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5ad9f5-34af-4b7d-92d1-eaab32e2a616" containerName="storage" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.160919 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.174046 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.352107 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxtm\" (UniqueName: \"kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.352156 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " 
pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.352261 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.453667 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxtm\" (UniqueName: \"kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.453717 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.453768 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.454331 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " 
pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.454682 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.482226 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxtm\" (UniqueName: \"kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm\") pod \"certified-operators-hr4db\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.504806 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:14 crc kubenswrapper[4990]: I1003 11:05:14.796068 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:15 crc kubenswrapper[4990]: I1003 11:05:15.297260 4990 generic.go:334] "Generic (PLEG): container finished" podID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerID="835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56" exitCode=0 Oct 03 11:05:15 crc kubenswrapper[4990]: I1003 11:05:15.297326 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerDied","Data":"835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56"} Oct 03 11:05:15 crc kubenswrapper[4990]: I1003 11:05:15.297748 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" 
event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerStarted","Data":"1345a93280d829f0d9bccf738212cd2d8997a4d745c6a71d0e2a8cccf4b33536"} Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.312227 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerStarted","Data":"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705"} Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.546871 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.548304 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.563091 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.686867 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.686931 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qwv\" (UniqueName: \"kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.686988 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.743642 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.745339 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.759131 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.810050 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.810219 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.810272 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qwv\" (UniqueName: \"kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc 
kubenswrapper[4990]: I1003 11:05:16.811361 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.811415 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.848026 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qwv\" (UniqueName: \"kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv\") pod \"community-operators-lhf57\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.876889 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.911674 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.911716 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:16 crc kubenswrapper[4990]: I1003 11:05:16.911763 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7kqg\" (UniqueName: \"kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.013358 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7kqg\" (UniqueName: \"kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.013712 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities\") pod \"redhat-operators-gxxdb\" (UID: 
\"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.013740 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.014158 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.014227 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.032778 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7kqg\" (UniqueName: \"kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg\") pod \"redhat-operators-gxxdb\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.113315 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.292620 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.324000 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerStarted","Data":"314e7ae5e8c318b406a27018e35ca95ea3515be2ada0b9635693d16088ff1376"} Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.329491 4990 generic.go:334] "Generic (PLEG): container finished" podID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerID="c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705" exitCode=0 Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.329586 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerDied","Data":"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705"} Oct 03 11:05:17 crc kubenswrapper[4990]: I1003 11:05:17.516900 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.337733 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerID="6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f" exitCode=0 Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.337800 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerDied","Data":"6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f"} Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.337824 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerStarted","Data":"e21cadfc96717a9e9dbc32beecb7db9c3c856833d3d1a54fe0b74c71eb637afe"} Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.342974 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerID="4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6" exitCode=0 Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.343055 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerDied","Data":"4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6"} Oct 03 11:05:18 crc kubenswrapper[4990]: I1003 11:05:18.346162 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerStarted","Data":"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e"} Oct 03 11:05:19 crc kubenswrapper[4990]: I1003 11:05:19.357236 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerStarted","Data":"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b"} Oct 03 11:05:19 crc kubenswrapper[4990]: I1003 11:05:19.378671 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hr4db" podStartSLOduration=2.866572241 podStartE2EDuration="5.378647305s" podCreationTimestamp="2025-10-03 11:05:14 +0000 UTC" firstStartedPulling="2025-10-03 11:05:15.299663138 +0000 UTC m=+4897.096295005" lastFinishedPulling="2025-10-03 11:05:17.811738212 +0000 UTC m=+4899.608370069" observedRunningTime="2025-10-03 11:05:18.402497281 +0000 UTC m=+4900.199129148" 
watchObservedRunningTime="2025-10-03 11:05:19.378647305 +0000 UTC m=+4901.175279152" Oct 03 11:05:20 crc kubenswrapper[4990]: I1003 11:05:20.369871 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerID="26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b" exitCode=0 Oct 03 11:05:20 crc kubenswrapper[4990]: I1003 11:05:20.369975 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerDied","Data":"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b"} Oct 03 11:05:20 crc kubenswrapper[4990]: I1003 11:05:20.375010 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerID="15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1" exitCode=0 Oct 03 11:05:20 crc kubenswrapper[4990]: I1003 11:05:20.375066 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerDied","Data":"15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1"} Oct 03 11:05:21 crc kubenswrapper[4990]: I1003 11:05:21.384067 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerStarted","Data":"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01"} Oct 03 11:05:21 crc kubenswrapper[4990]: I1003 11:05:21.388069 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerStarted","Data":"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be"} Oct 03 11:05:21 crc kubenswrapper[4990]: I1003 11:05:21.401620 4990 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-lhf57" podStartSLOduration=2.921668755 podStartE2EDuration="5.401601676s" podCreationTimestamp="2025-10-03 11:05:16 +0000 UTC" firstStartedPulling="2025-10-03 11:05:18.344254658 +0000 UTC m=+4900.140886515" lastFinishedPulling="2025-10-03 11:05:20.824187539 +0000 UTC m=+4902.620819436" observedRunningTime="2025-10-03 11:05:21.400858817 +0000 UTC m=+4903.197490744" watchObservedRunningTime="2025-10-03 11:05:21.401601676 +0000 UTC m=+4903.198233543" Oct 03 11:05:21 crc kubenswrapper[4990]: I1003 11:05:21.427723 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxxdb" podStartSLOduration=2.95776243 podStartE2EDuration="5.427703425s" podCreationTimestamp="2025-10-03 11:05:16 +0000 UTC" firstStartedPulling="2025-10-03 11:05:18.339962938 +0000 UTC m=+4900.136594795" lastFinishedPulling="2025-10-03 11:05:20.809903903 +0000 UTC m=+4902.606535790" observedRunningTime="2025-10-03 11:05:21.422376148 +0000 UTC m=+4903.219008015" watchObservedRunningTime="2025-10-03 11:05:21.427703425 +0000 UTC m=+4903.224335292" Oct 03 11:05:24 crc kubenswrapper[4990]: I1003 11:05:24.505150 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:24 crc kubenswrapper[4990]: I1003 11:05:24.505596 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:24 crc kubenswrapper[4990]: I1003 11:05:24.574555 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:25 crc kubenswrapper[4990]: I1003 11:05:25.505201 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:25 crc kubenswrapper[4990]: I1003 11:05:25.939273 4990 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:26 crc kubenswrapper[4990]: I1003 11:05:26.888726 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:26 crc kubenswrapper[4990]: I1003 11:05:26.888838 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:26 crc kubenswrapper[4990]: I1003 11:05:26.963288 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.114766 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.115025 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.188162 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.451718 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hr4db" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="registry-server" containerID="cri-o://37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e" gracePeriod=2 Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.519479 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:27 crc kubenswrapper[4990]: I1003 11:05:27.521614 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:27 crc 
kubenswrapper[4990]: I1003 11:05:27.901690 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.094433 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content\") pod \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.094594 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjxtm\" (UniqueName: \"kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm\") pod \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.094821 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities\") pod \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\" (UID: \"25153d7d-a5bc-4a2f-971c-06db24ed01a2\") " Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.096117 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities" (OuterVolumeSpecName: "utilities") pod "25153d7d-a5bc-4a2f-971c-06db24ed01a2" (UID: "25153d7d-a5bc-4a2f-971c-06db24ed01a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.097103 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.100868 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm" (OuterVolumeSpecName: "kube-api-access-cjxtm") pod "25153d7d-a5bc-4a2f-971c-06db24ed01a2" (UID: "25153d7d-a5bc-4a2f-971c-06db24ed01a2"). InnerVolumeSpecName "kube-api-access-cjxtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.149348 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25153d7d-a5bc-4a2f-971c-06db24ed01a2" (UID: "25153d7d-a5bc-4a2f-971c-06db24ed01a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.197901 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25153d7d-a5bc-4a2f-971c-06db24ed01a2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.197938 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjxtm\" (UniqueName: \"kubernetes.io/projected/25153d7d-a5bc-4a2f-971c-06db24ed01a2-kube-api-access-cjxtm\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.461413 4990 generic.go:334] "Generic (PLEG): container finished" podID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerID="37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e" exitCode=0 Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.461586 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerDied","Data":"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e"} Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.461612 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hr4db" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.461763 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hr4db" event={"ID":"25153d7d-a5bc-4a2f-971c-06db24ed01a2","Type":"ContainerDied","Data":"1345a93280d829f0d9bccf738212cd2d8997a4d745c6a71d0e2a8cccf4b33536"} Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.461814 4990 scope.go:117] "RemoveContainer" containerID="37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.491878 4990 scope.go:117] "RemoveContainer" containerID="c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.496752 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.501309 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hr4db"] Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.515116 4990 scope.go:117] "RemoveContainer" containerID="835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.559160 4990 scope.go:117] "RemoveContainer" containerID="37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e" Oct 03 11:05:28 crc kubenswrapper[4990]: E1003 11:05:28.559895 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e\": container with ID starting with 37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e not found: ID does not exist" containerID="37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.559937 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e"} err="failed to get container status \"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e\": rpc error: code = NotFound desc = could not find container \"37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e\": container with ID starting with 37b4918614e43f5eae99dc5f7f4ff914dc39997401d30dd461b8156a31750b6e not found: ID does not exist" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.559961 4990 scope.go:117] "RemoveContainer" containerID="c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705" Oct 03 11:05:28 crc kubenswrapper[4990]: E1003 11:05:28.560586 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705\": container with ID starting with c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705 not found: ID does not exist" containerID="c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.560642 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705"} err="failed to get container status \"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705\": rpc error: code = NotFound desc = could not find container \"c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705\": container with ID starting with c317214920fc9b20cf5295ed87175756896ec3fd925a69f78d5d34441bff4705 not found: ID does not exist" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.560685 4990 scope.go:117] "RemoveContainer" containerID="835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56" Oct 03 11:05:28 crc kubenswrapper[4990]: E1003 
11:05:28.561015 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56\": container with ID starting with 835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56 not found: ID does not exist" containerID="835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.561045 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56"} err="failed to get container status \"835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56\": rpc error: code = NotFound desc = could not find container \"835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56\": container with ID starting with 835f01b1663949ebc8eabb34eca3a0333380cc5add122f307c74abab7a5beb56 not found: ID does not exist" Oct 03 11:05:28 crc kubenswrapper[4990]: I1003 11:05:28.884499 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" path="/var/lib/kubelet/pods/25153d7d-a5bc-4a2f-971c-06db24ed01a2/volumes" Oct 03 11:05:29 crc kubenswrapper[4990]: I1003 11:05:29.339333 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:29 crc kubenswrapper[4990]: I1003 11:05:29.471522 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lhf57" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="registry-server" containerID="cri-o://845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01" gracePeriod=2 Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.002857 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.127069 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities\") pod \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.127145 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qwv\" (UniqueName: \"kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv\") pod \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.127181 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content\") pod \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\" (UID: \"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8\") " Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.128800 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities" (OuterVolumeSpecName: "utilities") pod "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" (UID: "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.134992 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv" (OuterVolumeSpecName: "kube-api-access-s2qwv") pod "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" (UID: "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8"). InnerVolumeSpecName "kube-api-access-s2qwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.174318 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" (UID: "9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.229244 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.229282 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qwv\" (UniqueName: \"kubernetes.io/projected/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-kube-api-access-s2qwv\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.229292 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.483138 4990 generic.go:334] "Generic (PLEG): container finished" podID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerID="845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01" exitCode=0 Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.483179 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerDied","Data":"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01"} Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.483204 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lhf57" event={"ID":"9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8","Type":"ContainerDied","Data":"314e7ae5e8c318b406a27018e35ca95ea3515be2ada0b9635693d16088ff1376"} Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.483220 4990 scope.go:117] "RemoveContainer" containerID="845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.483232 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhf57" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.511635 4990 scope.go:117] "RemoveContainer" containerID="15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.547680 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.548637 4990 scope.go:117] "RemoveContainer" containerID="4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.557963 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lhf57"] Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.592921 4990 scope.go:117] "RemoveContainer" containerID="845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01" Oct 03 11:05:30 crc kubenswrapper[4990]: E1003 11:05:30.594040 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01\": container with ID starting with 845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01 not found: ID does not exist" containerID="845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 
11:05:30.594076 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01"} err="failed to get container status \"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01\": rpc error: code = NotFound desc = could not find container \"845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01\": container with ID starting with 845fb2dc231f35291e3bcd3e77bdd600ea3b9dc9d970878895538b3e96c50b01 not found: ID does not exist" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.594099 4990 scope.go:117] "RemoveContainer" containerID="15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1" Oct 03 11:05:30 crc kubenswrapper[4990]: E1003 11:05:30.594563 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1\": container with ID starting with 15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1 not found: ID does not exist" containerID="15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.594595 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1"} err="failed to get container status \"15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1\": rpc error: code = NotFound desc = could not find container \"15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1\": container with ID starting with 15969c505a5dfac758ba38d05d605fee9fbd74fde4af59080a0c2763f19563f1 not found: ID does not exist" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.594621 4990 scope.go:117] "RemoveContainer" containerID="4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6" Oct 03 11:05:30 crc 
kubenswrapper[4990]: E1003 11:05:30.595161 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6\": container with ID starting with 4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6 not found: ID does not exist" containerID="4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.595236 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6"} err="failed to get container status \"4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6\": rpc error: code = NotFound desc = could not find container \"4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6\": container with ID starting with 4160e5b1f6e7c21c5a538d5d8361cf6ec3e0704ebe7e1b6e205919c7e00c35c6 not found: ID does not exist" Oct 03 11:05:30 crc kubenswrapper[4990]: I1003 11:05:30.891434 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" path="/var/lib/kubelet/pods/9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8/volumes" Oct 03 11:05:31 crc kubenswrapper[4990]: I1003 11:05:31.537838 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:31 crc kubenswrapper[4990]: I1003 11:05:31.539000 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxxdb" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="registry-server" containerID="cri-o://fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be" gracePeriod=2 Oct 03 11:05:31 crc kubenswrapper[4990]: I1003 11:05:31.996685 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.172443 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content\") pod \"2fa102f6-408c-429c-abd0-b8a38d44ca80\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.172522 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities\") pod \"2fa102f6-408c-429c-abd0-b8a38d44ca80\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.172645 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7kqg\" (UniqueName: \"kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg\") pod \"2fa102f6-408c-429c-abd0-b8a38d44ca80\" (UID: \"2fa102f6-408c-429c-abd0-b8a38d44ca80\") " Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.174005 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities" (OuterVolumeSpecName: "utilities") pod "2fa102f6-408c-429c-abd0-b8a38d44ca80" (UID: "2fa102f6-408c-429c-abd0-b8a38d44ca80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.183895 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg" (OuterVolumeSpecName: "kube-api-access-q7kqg") pod "2fa102f6-408c-429c-abd0-b8a38d44ca80" (UID: "2fa102f6-408c-429c-abd0-b8a38d44ca80"). InnerVolumeSpecName "kube-api-access-q7kqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.273832 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7kqg\" (UniqueName: \"kubernetes.io/projected/2fa102f6-408c-429c-abd0-b8a38d44ca80-kube-api-access-q7kqg\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.273868 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.274749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fa102f6-408c-429c-abd0-b8a38d44ca80" (UID: "2fa102f6-408c-429c-abd0-b8a38d44ca80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.375237 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fa102f6-408c-429c-abd0-b8a38d44ca80-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.509693 4990 generic.go:334] "Generic (PLEG): container finished" podID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerID="fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be" exitCode=0 Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.509754 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerDied","Data":"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be"} Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.509785 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-gxxdb" event={"ID":"2fa102f6-408c-429c-abd0-b8a38d44ca80","Type":"ContainerDied","Data":"e21cadfc96717a9e9dbc32beecb7db9c3c856833d3d1a54fe0b74c71eb637afe"} Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.509805 4990 scope.go:117] "RemoveContainer" containerID="fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.509968 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxxdb" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.544246 4990 scope.go:117] "RemoveContainer" containerID="26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.550904 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.558657 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxxdb"] Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.585289 4990 scope.go:117] "RemoveContainer" containerID="6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.619856 4990 scope.go:117] "RemoveContainer" containerID="fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be" Oct 03 11:05:32 crc kubenswrapper[4990]: E1003 11:05:32.620465 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be\": container with ID starting with fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be not found: ID does not exist" containerID="fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.620543 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be"} err="failed to get container status \"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be\": rpc error: code = NotFound desc = could not find container \"fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be\": container with ID starting with fa0c026982255be32b41c6cb70338907f55537a954b8525822d0d9db367b48be not found: ID does not exist" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.620579 4990 scope.go:117] "RemoveContainer" containerID="26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b" Oct 03 11:05:32 crc kubenswrapper[4990]: E1003 11:05:32.620920 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b\": container with ID starting with 26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b not found: ID does not exist" containerID="26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.620955 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b"} err="failed to get container status \"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b\": rpc error: code = NotFound desc = could not find container \"26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b\": container with ID starting with 26a7b05580eaf3d96f433197ec31f8b109168506f67e71314ad8e522137edf0b not found: ID does not exist" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.620972 4990 scope.go:117] "RemoveContainer" containerID="6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f" Oct 03 11:05:32 crc kubenswrapper[4990]: E1003 
11:05:32.621188 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f\": container with ID starting with 6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f not found: ID does not exist" containerID="6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.621217 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f"} err="failed to get container status \"6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f\": rpc error: code = NotFound desc = could not find container \"6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f\": container with ID starting with 6fd58c9c8994240fe045ac9f044eab953676711d547c6e9d7e1a59768b529a4f not found: ID does not exist" Oct 03 11:05:32 crc kubenswrapper[4990]: I1003 11:05:32.891200 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" path="/var/lib/kubelet/pods/2fa102f6-408c-429c-abd0-b8a38d44ca80/volumes" Oct 03 11:05:55 crc kubenswrapper[4990]: I1003 11:05:55.303937 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:05:55 crc kubenswrapper[4990]: I1003 11:05:55.304664 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.718736 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719272 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719284 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719295 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719301 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719314 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719319 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719340 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719347 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719369 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="registry-server" Oct 03 11:05:56 crc 
kubenswrapper[4990]: I1003 11:05:56.719377 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719389 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719399 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="extract-utilities" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719408 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719416 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719426 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719433 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="extract-content" Oct 03 11:05:56 crc kubenswrapper[4990]: E1003 11:05:56.719447 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719454 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719625 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="25153d7d-a5bc-4a2f-971c-06db24ed01a2" containerName="registry-server" Oct 03 11:05:56 crc 
kubenswrapper[4990]: I1003 11:05:56.719643 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc32a49-ff9f-45cb-ad9c-6b3b1ea761e8" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.719659 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa102f6-408c-429c-abd0-b8a38d44ca80" containerName="registry-server" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.720350 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.723376 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.728148 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.728208 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.732946 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-n8pcr" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.733543 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.734912 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.736425 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.739600 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.747756 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.836271 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxvc\" (UniqueName: \"kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.836612 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.836757 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.836935 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.837127 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxst\" (UniqueName: \"kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939071 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939391 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxst\" (UniqueName: \"kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939562 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxvc\" (UniqueName: \"kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939710 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939851 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.939960 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.940554 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.940560 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.969694 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxst\" (UniqueName: \"kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst\") pod \"dnsmasq-dns-74dc6b6b45-4gwcb\" (UID: 
\"154cb503-0430-4e3c-a832-a51782f5f025\") " pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:56 crc kubenswrapper[4990]: I1003 11:05:56.978642 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxvc\" (UniqueName: \"kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc\") pod \"dnsmasq-dns-7ddbd7865f-4xdjq\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.014009 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.014500 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.033068 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.034182 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.049354 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.049364 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.142125 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.142866 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.142918 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9r9l\" (UniqueName: \"kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.243950 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9r9l\" (UniqueName: \"kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.244045 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.244079 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.244912 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.245588 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.266738 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r9l\" (UniqueName: \"kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l\") pod \"dnsmasq-dns-8459d5dff9-xjkzg\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.309681 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.334836 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.335977 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.352813 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.426668 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.452131 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.452170 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.452189 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pzw\" (UniqueName: \"kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.527623 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:05:57 crc kubenswrapper[4990]: W1003 
11:05:57.535162 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod154cb503_0430_4e3c_a832_a51782f5f025.slice/crio-8b5aa3d41de0d35937f64261a01f35446356cab09ed6f9f377525e1ca80f40fb WatchSource:0}: Error finding container 8b5aa3d41de0d35937f64261a01f35446356cab09ed6f9f377525e1ca80f40fb: Status 404 returned error can't find the container with id 8b5aa3d41de0d35937f64261a01f35446356cab09ed6f9f377525e1ca80f40fb Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.553840 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.553885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.553920 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pzw\" (UniqueName: \"kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.554805 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " 
pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.555470 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.571679 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pzw\" (UniqueName: \"kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw\") pod \"dnsmasq-dns-5b77d44889-psgnq\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.649631 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.694457 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.716304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" event={"ID":"154cb503-0430-4e3c-a832-a51782f5f025","Type":"ContainerStarted","Data":"8b5aa3d41de0d35937f64261a01f35446356cab09ed6f9f377525e1ca80f40fb"} Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.717108 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" event={"ID":"1e936bb2-0804-4783-94e6-dcb8a6821e8f","Type":"ContainerStarted","Data":"4d185abe8ceab7487cb5e885f6a5926f0a6512e46428eeb664f56e90974689e1"} Oct 03 11:05:57 crc kubenswrapper[4990]: I1003 11:05:57.892157 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:05:57 crc kubenswrapper[4990]: W1003 11:05:57.976030 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f83368_8781_4278_905f_0b405c5f4928.slice/crio-3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f WatchSource:0}: Error finding container 3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f: Status 404 returned error can't find the container with id 3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.138660 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.199471 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.203696 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.206382 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.206603 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z4z2f" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.207596 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.207785 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.207990 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.208146 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.208281 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.232968 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264346 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264570 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264700 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264738 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264772 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264820 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.264947 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.265027 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.265073 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.265103 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwfh\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.365885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.365964 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.365993 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366014 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwfh\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366041 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366071 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366093 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366128 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366151 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366174 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.366227 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 
11:05:58.368123 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.377372 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.378599 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.379336 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.383952 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.384708 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.385017 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.385074 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.393908 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.403753 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.403794 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d1a63a80764637e6840e48d066d143b6c2316b148a10b0e73149fc27a3797c67/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.411226 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwfh\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.475946 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.477051 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.490957 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491114 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491214 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491252 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v95t6" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491603 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491830 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.491931 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.500090 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.549563 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569122 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569165 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569191 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569223 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569264 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569402 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569472 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569586 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569647 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569695 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.569863 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qck\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.658853 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670417 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670472 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670528 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670553 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qck\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670586 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670607 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670624 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670645 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670674 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 
crc kubenswrapper[4990]: I1003 11:05:58.670691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.670710 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.671550 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.671777 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.672021 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.672675 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.673704 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.673731 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/48c415e463b64b7f7cdde39f2ccb21f532e0762692d5d7249f65ab51659afcd4/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.674391 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.675170 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.676371 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.676777 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.680667 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.699468 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.708059 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qck\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck\") pod \"rabbitmq-cell1-server-0\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.728812 4990 generic.go:334] "Generic (PLEG): container finished" podID="154cb503-0430-4e3c-a832-a51782f5f025" containerID="a25f69f40cc62ad19067eb8c70a0a6fd5ecb7fc45680f9c1ec282353850af928" 
exitCode=0 Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.728858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" event={"ID":"154cb503-0430-4e3c-a832-a51782f5f025","Type":"ContainerDied","Data":"a25f69f40cc62ad19067eb8c70a0a6fd5ecb7fc45680f9c1ec282353850af928"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.731172 4990 generic.go:334] "Generic (PLEG): container finished" podID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerID="580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9" exitCode=0 Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.731213 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" event={"ID":"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7","Type":"ContainerDied","Data":"580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.731226 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" event={"ID":"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7","Type":"ContainerStarted","Data":"1742ff18a33f12e7bc1d0fe7cc2b78e5be0da70bc34cebacca1822c59c34edcc"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.733852 4990 generic.go:334] "Generic (PLEG): container finished" podID="1e936bb2-0804-4783-94e6-dcb8a6821e8f" containerID="914e6208721671d8acb1cf4c4f6e601eded2ef7b635ba48257152edfd46bdf19" exitCode=0 Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.733902 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" event={"ID":"1e936bb2-0804-4783-94e6-dcb8a6821e8f","Type":"ContainerDied","Data":"914e6208721671d8acb1cf4c4f6e601eded2ef7b635ba48257152edfd46bdf19"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.735880 4990 generic.go:334] "Generic (PLEG): container finished" podID="c0f83368-8781-4278-905f-0b405c5f4928" 
containerID="0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4" exitCode=0 Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.735913 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" event={"ID":"c0f83368-8781-4278-905f-0b405c5f4928","Type":"ContainerDied","Data":"0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.735935 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" event={"ID":"c0f83368-8781-4278-905f-0b405c5f4928","Type":"ContainerStarted","Data":"3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f"} Oct 03 11:05:58 crc kubenswrapper[4990]: I1003 11:05:58.826841 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:05:58 crc kubenswrapper[4990]: E1003 11:05:58.962999 4990 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 11:05:58 crc kubenswrapper[4990]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c0f83368-8781-4278-905f-0b405c5f4928/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 11:05:58 crc kubenswrapper[4990]: > podSandboxID="3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f" Oct 03 11:05:58 crc kubenswrapper[4990]: E1003 11:05:58.963145 4990 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 11:05:58 crc kubenswrapper[4990]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts 
--domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b9r9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8459d5dff9-xjkzg_openstack(c0f83368-8781-4278-905f-0b405c5f4928): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c0f83368-8781-4278-905f-0b405c5f4928/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 11:05:58 crc kubenswrapper[4990]: > logger="UnhandledError" Oct 03 11:05:58 crc kubenswrapper[4990]: E1003 11:05:58.964268 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c0f83368-8781-4278-905f-0b405c5f4928/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" podUID="c0f83368-8781-4278-905f-0b405c5f4928" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.152423 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.296066 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.299523 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.356133 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.383707 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config\") pod \"154cb503-0430-4e3c-a832-a51782f5f025\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.383750 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config\") pod \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.383773 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc\") pod \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.383842 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxvc\" (UniqueName: \"kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc\") pod \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\" (UID: \"1e936bb2-0804-4783-94e6-dcb8a6821e8f\") " Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.383860 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxst\" (UniqueName: 
\"kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst\") pod \"154cb503-0430-4e3c-a832-a51782f5f025\" (UID: \"154cb503-0430-4e3c-a832-a51782f5f025\") " Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.391540 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc" (OuterVolumeSpecName: "kube-api-access-msxvc") pod "1e936bb2-0804-4783-94e6-dcb8a6821e8f" (UID: "1e936bb2-0804-4783-94e6-dcb8a6821e8f"). InnerVolumeSpecName "kube-api-access-msxvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.391815 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst" (OuterVolumeSpecName: "kube-api-access-pzxst") pod "154cb503-0430-4e3c-a832-a51782f5f025" (UID: "154cb503-0430-4e3c-a832-a51782f5f025"). InnerVolumeSpecName "kube-api-access-pzxst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.403395 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config" (OuterVolumeSpecName: "config") pod "1e936bb2-0804-4783-94e6-dcb8a6821e8f" (UID: "1e936bb2-0804-4783-94e6-dcb8a6821e8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.415746 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e936bb2-0804-4783-94e6-dcb8a6821e8f" (UID: "1e936bb2-0804-4783-94e6-dcb8a6821e8f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.418954 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config" (OuterVolumeSpecName: "config") pod "154cb503-0430-4e3c-a832-a51782f5f025" (UID: "154cb503-0430-4e3c-a832-a51782f5f025"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.485367 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/154cb503-0430-4e3c-a832-a51782f5f025-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.485398 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.485407 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e936bb2-0804-4783-94e6-dcb8a6821e8f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.485420 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxvc\" (UniqueName: \"kubernetes.io/projected/1e936bb2-0804-4783-94e6-dcb8a6821e8f-kube-api-access-msxvc\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.485429 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxst\" (UniqueName: \"kubernetes.io/projected/154cb503-0430-4e3c-a832-a51782f5f025-kube-api-access-pzxst\") on node \"crc\" DevicePath \"\"" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.743843 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerStarted","Data":"1ab4d9a5c7f36ef860bcc29377af6b03eeb7b379389dcef8f77139f32ce19e2f"} Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.745204 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" event={"ID":"154cb503-0430-4e3c-a832-a51782f5f025","Type":"ContainerDied","Data":"8b5aa3d41de0d35937f64261a01f35446356cab09ed6f9f377525e1ca80f40fb"} Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.745228 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc6b6b45-4gwcb" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.745278 4990 scope.go:117] "RemoveContainer" containerID="a25f69f40cc62ad19067eb8c70a0a6fd5ecb7fc45680f9c1ec282353850af928" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.746178 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerStarted","Data":"724fee61fa3828bcd01cf199d6b9734b190886b1f5d15e04eee9472ae875c250"} Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.750022 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" event={"ID":"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7","Type":"ContainerStarted","Data":"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a"} Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.750073 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.753457 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.753828 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddbd7865f-4xdjq" event={"ID":"1e936bb2-0804-4783-94e6-dcb8a6821e8f","Type":"ContainerDied","Data":"4d185abe8ceab7487cb5e885f6a5926f0a6512e46428eeb664f56e90974689e1"} Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.770763 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" podStartSLOduration=2.7707422250000002 podStartE2EDuration="2.770742225s" podCreationTimestamp="2025-10-03 11:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:05:59.769649377 +0000 UTC m=+4941.566281254" watchObservedRunningTime="2025-10-03 11:05:59.770742225 +0000 UTC m=+4941.567374102" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.781039 4990 scope.go:117] "RemoveContainer" containerID="914e6208721671d8acb1cf4c4f6e601eded2ef7b635ba48257152edfd46bdf19" Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.852636 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.854564 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ddbd7865f-4xdjq"] Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.885947 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:05:59 crc kubenswrapper[4990]: I1003 11:05:59.890486 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc6b6b45-4gwcb"] Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.400466 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 11:06:00 crc kubenswrapper[4990]: E1003 
11:06:00.401050 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154cb503-0430-4e3c-a832-a51782f5f025" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.401064 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="154cb503-0430-4e3c-a832-a51782f5f025" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: E1003 11:06:00.401099 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e936bb2-0804-4783-94e6-dcb8a6821e8f" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.401107 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e936bb2-0804-4783-94e6-dcb8a6821e8f" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.401271 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="154cb503-0430-4e3c-a832-a51782f5f025" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.401290 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e936bb2-0804-4783-94e6-dcb8a6821e8f" containerName="init" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.402162 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.404414 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.404494 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9wvrq" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.405299 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.405597 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.407652 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.419648 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.424082 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512015 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512063 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") 
" pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512110 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512139 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512176 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512202 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512232 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4xb\" (UniqueName: \"kubernetes.io/projected/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kube-api-access-pz4xb\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " 
pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512256 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.512273 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-secrets\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613571 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4xb\" (UniqueName: \"kubernetes.io/projected/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kube-api-access-pz4xb\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613621 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613644 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-secrets\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613667 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613686 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613712 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613738 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613774 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.613798 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.614527 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.616863 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.617552 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.618028 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.618063 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce4a8233c957675789e21af2b1b33083c18580cf593d53b7a769abb1e1546b9d/globalmount\"" pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.618825 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.619134 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.619567 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-secrets\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.623054 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " 
pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.643686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4xb\" (UniqueName: \"kubernetes.io/projected/c2f2afdb-bc52-41b3-bfb2-05cd189491ea-kube-api-access-pz4xb\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.661991 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-be66a20d-d477-4876-9359-13ed8dbe1edc\") pod \"openstack-galera-0\" (UID: \"c2f2afdb-bc52-41b3-bfb2-05cd189491ea\") " pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.715560 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.787340 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" event={"ID":"c0f83368-8781-4278-905f-0b405c5f4928","Type":"ContainerStarted","Data":"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66"} Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.787683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.789442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerStarted","Data":"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4"} Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.794223 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerStarted","Data":"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0"} Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.841726 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" podStartSLOduration=3.84170625 podStartE2EDuration="3.84170625s" podCreationTimestamp="2025-10-03 11:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:00.810052068 +0000 UTC m=+4942.606683965" watchObservedRunningTime="2025-10-03 11:06:00.84170625 +0000 UTC m=+4942.638338127" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.890711 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154cb503-0430-4e3c-a832-a51782f5f025" path="/var/lib/kubelet/pods/154cb503-0430-4e3c-a832-a51782f5f025/volumes" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.891176 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e936bb2-0804-4783-94e6-dcb8a6821e8f" path="/var/lib/kubelet/pods/1e936bb2-0804-4783-94e6-dcb8a6821e8f/volumes" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.973000 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.974126 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.976431 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.977081 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r98nd" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.977834 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.977832 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 11:06:00 crc kubenswrapper[4990]: I1003 11:06:00.989403 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133228 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133678 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88vk\" (UniqueName: 
\"kubernetes.io/projected/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kube-api-access-k88vk\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133777 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133860 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.133954 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.134054 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.134153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.134605 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-319caca0-26b6-47ca-885f-d18df30341ef\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caca0-26b6-47ca-885f-d18df30341ef\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.206590 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.235932 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236052 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236129 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88vk\" (UniqueName: \"kubernetes.io/projected/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kube-api-access-k88vk\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" 
Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236216 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236303 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236389 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236472 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236581 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.236668 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-319caca0-26b6-47ca-885f-d18df30341ef\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caca0-26b6-47ca-885f-d18df30341ef\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.237451 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.237775 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.239028 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.239266 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.240838 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.241005 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.241242 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.241998 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.242029 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-319caca0-26b6-47ca-885f-d18df30341ef\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caca0-26b6-47ca-885f-d18df30341ef\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e5a5e7d6f162f4be91a7f04039b90a27e60b50f65aa452a3acc3910585b0d4e/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.254942 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88vk\" (UniqueName: \"kubernetes.io/projected/8b68b675-4162-4159-9d49-ab3ee3c5a6e2-kube-api-access-k88vk\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.274755 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-319caca0-26b6-47ca-885f-d18df30341ef\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-319caca0-26b6-47ca-885f-d18df30341ef\") pod \"openstack-cell1-galera-0\" (UID: \"8b68b675-4162-4159-9d49-ab3ee3c5a6e2\") " pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.298935 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.346219 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.347573 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.355960 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-b9m8d" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.361181 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.363653 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.370248 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.540735 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.541191 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.541224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtqv\" (UniqueName: \"kubernetes.io/projected/8c316927-e4ca-496b-96a2-f31602456e6d-kube-api-access-nmtqv\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.541241 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-config-data\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.541279 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-kolla-config\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.642994 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.643078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtqv\" (UniqueName: \"kubernetes.io/projected/8c316927-e4ca-496b-96a2-f31602456e6d-kube-api-access-nmtqv\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.643116 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-config-data\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.643194 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-kolla-config\") pod \"memcached-0\" 
(UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.643255 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.644215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-kolla-config\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.644337 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c316927-e4ca-496b-96a2-f31602456e6d-config-data\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.649220 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.652132 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c316927-e4ca-496b-96a2-f31602456e6d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.664217 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtqv\" 
(UniqueName: \"kubernetes.io/projected/8c316927-e4ca-496b-96a2-f31602456e6d-kube-api-access-nmtqv\") pod \"memcached-0\" (UID: \"8c316927-e4ca-496b-96a2-f31602456e6d\") " pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.749529 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.775678 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 11:06:01 crc kubenswrapper[4990]: W1003 11:06:01.787895 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b68b675_4162_4159_9d49_ab3ee3c5a6e2.slice/crio-5ed9dc508df6bc1896900bbb264378181761662cf375544a86a53b5593ed9bfb WatchSource:0}: Error finding container 5ed9dc508df6bc1896900bbb264378181761662cf375544a86a53b5593ed9bfb: Status 404 returned error can't find the container with id 5ed9dc508df6bc1896900bbb264378181761662cf375544a86a53b5593ed9bfb Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.811788 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2f2afdb-bc52-41b3-bfb2-05cd189491ea","Type":"ContainerStarted","Data":"93e9dbe2cdbf76a55730c6420cf3ea3cd78fdecfc89168f342bfae3666bd0a3c"} Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.811854 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2f2afdb-bc52-41b3-bfb2-05cd189491ea","Type":"ContainerStarted","Data":"6b2031585dbd274447a131a2daf18dc1ace5249cd63a145676a6e312851b755c"} Oct 03 11:06:01 crc kubenswrapper[4990]: I1003 11:06:01.814228 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b68b675-4162-4159-9d49-ab3ee3c5a6e2","Type":"ContainerStarted","Data":"5ed9dc508df6bc1896900bbb264378181761662cf375544a86a53b5593ed9bfb"} Oct 03 
11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.039793 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 11:06:02 crc kubenswrapper[4990]: W1003 11:06:02.041858 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c316927_e4ca_496b_96a2_f31602456e6d.slice/crio-5a102cf47a17e9267ba99465207ff837deca8b6b05e7f569f1f98c092c722105 WatchSource:0}: Error finding container 5a102cf47a17e9267ba99465207ff837deca8b6b05e7f569f1f98c092c722105: Status 404 returned error can't find the container with id 5a102cf47a17e9267ba99465207ff837deca8b6b05e7f569f1f98c092c722105 Oct 03 11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.826926 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b68b675-4162-4159-9d49-ab3ee3c5a6e2","Type":"ContainerStarted","Data":"14338934b3bc60513713f9d99a415ea698e7e7fc4db7d4ff50af5dc727f867ea"} Oct 03 11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.829552 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8c316927-e4ca-496b-96a2-f31602456e6d","Type":"ContainerStarted","Data":"947681a511028b31f83b2c2355409ea98d0d11f1dccf3381617821645ac59f38"} Oct 03 11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.829607 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8c316927-e4ca-496b-96a2-f31602456e6d","Type":"ContainerStarted","Data":"5a102cf47a17e9267ba99465207ff837deca8b6b05e7f569f1f98c092c722105"} Oct 03 11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.829897 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 11:06:02 crc kubenswrapper[4990]: I1003 11:06:02.890983 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.890960553 podStartE2EDuration="1.890960553s" 
podCreationTimestamp="2025-10-03 11:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:02.88733081 +0000 UTC m=+4944.683962747" watchObservedRunningTime="2025-10-03 11:06:02.890960553 +0000 UTC m=+4944.687592420" Oct 03 11:06:04 crc kubenswrapper[4990]: I1003 11:06:04.853977 4990 generic.go:334] "Generic (PLEG): container finished" podID="c2f2afdb-bc52-41b3-bfb2-05cd189491ea" containerID="93e9dbe2cdbf76a55730c6420cf3ea3cd78fdecfc89168f342bfae3666bd0a3c" exitCode=0 Oct 03 11:06:04 crc kubenswrapper[4990]: I1003 11:06:04.854103 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2f2afdb-bc52-41b3-bfb2-05cd189491ea","Type":"ContainerDied","Data":"93e9dbe2cdbf76a55730c6420cf3ea3cd78fdecfc89168f342bfae3666bd0a3c"} Oct 03 11:06:05 crc kubenswrapper[4990]: I1003 11:06:05.864944 4990 generic.go:334] "Generic (PLEG): container finished" podID="8b68b675-4162-4159-9d49-ab3ee3c5a6e2" containerID="14338934b3bc60513713f9d99a415ea698e7e7fc4db7d4ff50af5dc727f867ea" exitCode=0 Oct 03 11:06:05 crc kubenswrapper[4990]: I1003 11:06:05.865010 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b68b675-4162-4159-9d49-ab3ee3c5a6e2","Type":"ContainerDied","Data":"14338934b3bc60513713f9d99a415ea698e7e7fc4db7d4ff50af5dc727f867ea"} Oct 03 11:06:05 crc kubenswrapper[4990]: I1003 11:06:05.869127 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"c2f2afdb-bc52-41b3-bfb2-05cd189491ea","Type":"ContainerStarted","Data":"a82725f8f00f2d0e1211fb4d8b591057c84741c2dbeaed7805a7fb1ed0c59406"} Oct 03 11:06:05 crc kubenswrapper[4990]: I1003 11:06:05.937238 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=6.937224147 podStartE2EDuration="6.937224147s" 
podCreationTimestamp="2025-10-03 11:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:05.935557734 +0000 UTC m=+4947.732189661" watchObservedRunningTime="2025-10-03 11:06:05.937224147 +0000 UTC m=+4947.733856004" Oct 03 11:06:06 crc kubenswrapper[4990]: I1003 11:06:06.884954 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8b68b675-4162-4159-9d49-ab3ee3c5a6e2","Type":"ContainerStarted","Data":"8c2311cfd59f6cfb4ef951a1e544e3bd714cf4c09ef780b0686038381260e095"} Oct 03 11:06:06 crc kubenswrapper[4990]: I1003 11:06:06.921129 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.92109868 podStartE2EDuration="7.92109868s" podCreationTimestamp="2025-10-03 11:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:06.915211329 +0000 UTC m=+4948.711843226" watchObservedRunningTime="2025-10-03 11:06:06.92109868 +0000 UTC m=+4948.717730537" Oct 03 11:06:07 crc kubenswrapper[4990]: I1003 11:06:07.429412 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:06:07 crc kubenswrapper[4990]: I1003 11:06:07.696144 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:06:07 crc kubenswrapper[4990]: I1003 11:06:07.757025 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:06:07 crc kubenswrapper[4990]: I1003 11:06:07.888448 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="dnsmasq-dns" 
containerID="cri-o://d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66" gracePeriod=10 Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.366133 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.461713 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9r9l\" (UniqueName: \"kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l\") pod \"c0f83368-8781-4278-905f-0b405c5f4928\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.461829 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc\") pod \"c0f83368-8781-4278-905f-0b405c5f4928\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.461873 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config\") pod \"c0f83368-8781-4278-905f-0b405c5f4928\" (UID: \"c0f83368-8781-4278-905f-0b405c5f4928\") " Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.466807 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l" (OuterVolumeSpecName: "kube-api-access-b9r9l") pod "c0f83368-8781-4278-905f-0b405c5f4928" (UID: "c0f83368-8781-4278-905f-0b405c5f4928"). InnerVolumeSpecName "kube-api-access-b9r9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.496655 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config" (OuterVolumeSpecName: "config") pod "c0f83368-8781-4278-905f-0b405c5f4928" (UID: "c0f83368-8781-4278-905f-0b405c5f4928"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.499311 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0f83368-8781-4278-905f-0b405c5f4928" (UID: "c0f83368-8781-4278-905f-0b405c5f4928"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.563259 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9r9l\" (UniqueName: \"kubernetes.io/projected/c0f83368-8781-4278-905f-0b405c5f4928-kube-api-access-b9r9l\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.563299 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.563313 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0f83368-8781-4278-905f-0b405c5f4928-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.897984 4990 generic.go:334] "Generic (PLEG): container finished" podID="c0f83368-8781-4278-905f-0b405c5f4928" containerID="d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66" exitCode=0 Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 
11:06:08.898035 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.898038 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" event={"ID":"c0f83368-8781-4278-905f-0b405c5f4928","Type":"ContainerDied","Data":"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66"} Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.898217 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459d5dff9-xjkzg" event={"ID":"c0f83368-8781-4278-905f-0b405c5f4928","Type":"ContainerDied","Data":"3e0db2ce06fd175100e17836349b4ace95954e43a72e03173fab4230d658396f"} Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.898256 4990 scope.go:117] "RemoveContainer" containerID="d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.923403 4990 scope.go:117] "RemoveContainer" containerID="0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.926327 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.931174 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8459d5dff9-xjkzg"] Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.943834 4990 scope.go:117] "RemoveContainer" containerID="d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66" Oct 03 11:06:08 crc kubenswrapper[4990]: E1003 11:06:08.944473 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66\": container with ID starting with d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66 not found: 
ID does not exist" containerID="d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.944557 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66"} err="failed to get container status \"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66\": rpc error: code = NotFound desc = could not find container \"d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66\": container with ID starting with d263d101cc66a42f005477f317c6e64b1b4735adc97d558a165412a4f9491d66 not found: ID does not exist" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.944601 4990 scope.go:117] "RemoveContainer" containerID="0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4" Oct 03 11:06:08 crc kubenswrapper[4990]: E1003 11:06:08.945213 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4\": container with ID starting with 0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4 not found: ID does not exist" containerID="0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4" Oct 03 11:06:08 crc kubenswrapper[4990]: I1003 11:06:08.945260 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4"} err="failed to get container status \"0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4\": rpc error: code = NotFound desc = could not find container \"0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4\": container with ID starting with 0fb51655eb7e871359de13b2b3dcdadf3e9eceb0f80bfc68f86a082c7e5079a4 not found: ID does not exist" Oct 03 11:06:10 crc kubenswrapper[4990]: I1003 11:06:10.715870 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 11:06:10 crc kubenswrapper[4990]: I1003 11:06:10.717780 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 11:06:10 crc kubenswrapper[4990]: I1003 11:06:10.888080 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f83368-8781-4278-905f-0b405c5f4928" path="/var/lib/kubelet/pods/c0f83368-8781-4278-905f-0b405c5f4928/volumes" Oct 03 11:06:11 crc kubenswrapper[4990]: I1003 11:06:11.299924 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:11 crc kubenswrapper[4990]: I1003 11:06:11.299989 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:11 crc kubenswrapper[4990]: I1003 11:06:11.751736 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 11:06:12 crc kubenswrapper[4990]: I1003 11:06:12.807402 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 11:06:12 crc kubenswrapper[4990]: I1003 11:06:12.892396 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 11:06:13 crc kubenswrapper[4990]: I1003 11:06:13.388905 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:13 crc kubenswrapper[4990]: I1003 11:06:13.478177 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 11:06:25 crc kubenswrapper[4990]: I1003 11:06:25.304212 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:06:25 crc kubenswrapper[4990]: I1003 11:06:25.304757 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.025004 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:30 crc kubenswrapper[4990]: E1003 11:06:30.026151 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="dnsmasq-dns" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.026174 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="dnsmasq-dns" Oct 03 11:06:30 crc kubenswrapper[4990]: E1003 11:06:30.026199 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="init" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.026212 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="init" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.026652 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f83368-8781-4278-905f-0b405c5f4928" containerName="dnsmasq-dns" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.029398 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.041633 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.041783 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkx5d\" (UniqueName: \"kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.041848 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.042017 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.143274 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkx5d\" (UniqueName: \"kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.143632 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.143905 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.144456 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.144475 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.165845 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkx5d\" (UniqueName: \"kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d\") pod \"redhat-marketplace-kgztp\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.349539 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:30 crc kubenswrapper[4990]: I1003 11:06:30.823345 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:31 crc kubenswrapper[4990]: I1003 11:06:31.115733 4990 generic.go:334] "Generic (PLEG): container finished" podID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerID="f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461" exitCode=0 Oct 03 11:06:31 crc kubenswrapper[4990]: I1003 11:06:31.115798 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgztp" event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerDied","Data":"f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461"} Oct 03 11:06:31 crc kubenswrapper[4990]: I1003 11:06:31.115834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgztp" event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerStarted","Data":"ba61bde51780574574c1484358c87f29dba364de09e95bd491c8b7fcad3da836"} Oct 03 11:06:32 crc kubenswrapper[4990]: I1003 11:06:32.128445 4990 generic.go:334] "Generic (PLEG): container finished" podID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerID="6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0" exitCode=0 Oct 03 11:06:32 crc kubenswrapper[4990]: I1003 11:06:32.128535 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgztp" event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerDied","Data":"6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0"} Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.140986 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgztp" 
event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerStarted","Data":"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da"} Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.142976 4990 generic.go:334] "Generic (PLEG): container finished" podID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerID="7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4" exitCode=0 Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.143084 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerDied","Data":"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4"} Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.144812 4990 generic.go:334] "Generic (PLEG): container finished" podID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerID="f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0" exitCode=0 Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.144862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerDied","Data":"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0"} Oct 03 11:06:33 crc kubenswrapper[4990]: I1003 11:06:33.179748 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgztp" podStartSLOduration=1.738630924 podStartE2EDuration="3.179710984s" podCreationTimestamp="2025-10-03 11:06:30 +0000 UTC" firstStartedPulling="2025-10-03 11:06:31.117981669 +0000 UTC m=+4972.914613556" lastFinishedPulling="2025-10-03 11:06:32.559061719 +0000 UTC m=+4974.355693616" observedRunningTime="2025-10-03 11:06:33.16630812 +0000 UTC m=+4974.962940007" watchObservedRunningTime="2025-10-03 11:06:33.179710984 +0000 UTC m=+4974.976342871" Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.158133 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerStarted","Data":"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e"} Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.158576 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.160805 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerStarted","Data":"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07"} Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.161029 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.182791 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.182760778 podStartE2EDuration="37.182760778s" podCreationTimestamp="2025-10-03 11:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:34.178756826 +0000 UTC m=+4975.975388703" watchObservedRunningTime="2025-10-03 11:06:34.182760778 +0000 UTC m=+4975.979392685" Oct 03 11:06:34 crc kubenswrapper[4990]: I1003 11:06:34.211760 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.21170475 podStartE2EDuration="37.21170475s" podCreationTimestamp="2025-10-03 11:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:34.206825315 +0000 UTC m=+4976.003457192" watchObservedRunningTime="2025-10-03 11:06:34.21170475 +0000 UTC m=+4976.008336647" Oct 03 11:06:40 
crc kubenswrapper[4990]: I1003 11:06:40.350574 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:40 crc kubenswrapper[4990]: I1003 11:06:40.351288 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:40 crc kubenswrapper[4990]: I1003 11:06:40.421576 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:41 crc kubenswrapper[4990]: I1003 11:06:41.291276 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:41 crc kubenswrapper[4990]: I1003 11:06:41.343595 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.246323 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kgztp" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="registry-server" containerID="cri-o://cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da" gracePeriod=2 Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.726944 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.859087 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities\") pod \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.859209 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkx5d\" (UniqueName: \"kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d\") pod \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.859261 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content\") pod \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\" (UID: \"750aefb6-6ba7-4afb-b623-83ac34d4b5a6\") " Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.860385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities" (OuterVolumeSpecName: "utilities") pod "750aefb6-6ba7-4afb-b623-83ac34d4b5a6" (UID: "750aefb6-6ba7-4afb-b623-83ac34d4b5a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.867102 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d" (OuterVolumeSpecName: "kube-api-access-xkx5d") pod "750aefb6-6ba7-4afb-b623-83ac34d4b5a6" (UID: "750aefb6-6ba7-4afb-b623-83ac34d4b5a6"). InnerVolumeSpecName "kube-api-access-xkx5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.881250 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "750aefb6-6ba7-4afb-b623-83ac34d4b5a6" (UID: "750aefb6-6ba7-4afb-b623-83ac34d4b5a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.961458 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.961531 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkx5d\" (UniqueName: \"kubernetes.io/projected/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-kube-api-access-xkx5d\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:43 crc kubenswrapper[4990]: I1003 11:06:43.961554 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/750aefb6-6ba7-4afb-b623-83ac34d4b5a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.269931 4990 generic.go:334] "Generic (PLEG): container finished" podID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerID="cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da" exitCode=0 Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.270002 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgztp" event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerDied","Data":"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da"} Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.270096 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kgztp" event={"ID":"750aefb6-6ba7-4afb-b623-83ac34d4b5a6","Type":"ContainerDied","Data":"ba61bde51780574574c1484358c87f29dba364de09e95bd491c8b7fcad3da836"} Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.270128 4990 scope.go:117] "RemoveContainer" containerID="cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.270130 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgztp" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.307576 4990 scope.go:117] "RemoveContainer" containerID="6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.332589 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.340830 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgztp"] Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.344422 4990 scope.go:117] "RemoveContainer" containerID="f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.377377 4990 scope.go:117] "RemoveContainer" containerID="cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da" Oct 03 11:06:44 crc kubenswrapper[4990]: E1003 11:06:44.378010 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da\": container with ID starting with cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da not found: ID does not exist" containerID="cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.378064 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da"} err="failed to get container status \"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da\": rpc error: code = NotFound desc = could not find container \"cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da\": container with ID starting with cc0fee5255fa9863da0facaaa8102d4ceb4be7cc6fe5c13161ee8930dca1c8da not found: ID does not exist" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.378097 4990 scope.go:117] "RemoveContainer" containerID="6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0" Oct 03 11:06:44 crc kubenswrapper[4990]: E1003 11:06:44.378618 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0\": container with ID starting with 6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0 not found: ID does not exist" containerID="6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.378823 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0"} err="failed to get container status \"6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0\": rpc error: code = NotFound desc = could not find container \"6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0\": container with ID starting with 6a8ad15959ec4618b7aaf9ff377586d2a20fc1b087f9f6d09dec21be133611c0 not found: ID does not exist" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.378878 4990 scope.go:117] "RemoveContainer" containerID="f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461" Oct 03 11:06:44 crc kubenswrapper[4990]: E1003 
11:06:44.379414 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461\": container with ID starting with f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461 not found: ID does not exist" containerID="f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.379453 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461"} err="failed to get container status \"f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461\": rpc error: code = NotFound desc = could not find container \"f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461\": container with ID starting with f7b51ef015c11db244b2fedb8dca4535e8cd0edf3b76605220c720c747926461 not found: ID does not exist" Oct 03 11:06:44 crc kubenswrapper[4990]: I1003 11:06:44.886750 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" path="/var/lib/kubelet/pods/750aefb6-6ba7-4afb-b623-83ac34d4b5a6/volumes" Oct 03 11:06:48 crc kubenswrapper[4990]: I1003 11:06:48.662828 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 11:06:48 crc kubenswrapper[4990]: I1003 11:06:48.830761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.294522 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:06:54 crc kubenswrapper[4990]: E1003 11:06:54.295399 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="extract-utilities" Oct 03 11:06:54 crc 
kubenswrapper[4990]: I1003 11:06:54.295414 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="extract-utilities" Oct 03 11:06:54 crc kubenswrapper[4990]: E1003 11:06:54.295448 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="registry-server" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.295457 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="registry-server" Oct 03 11:06:54 crc kubenswrapper[4990]: E1003 11:06:54.295476 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="extract-content" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.295487 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="extract-content" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.295698 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="750aefb6-6ba7-4afb-b623-83ac34d4b5a6" containerName="registry-server" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.296632 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.323410 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.354211 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnm4r\" (UniqueName: \"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.354310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.354341 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.455710 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnm4r\" (UniqueName: \"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.455838 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.455875 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.457011 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.457185 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.483724 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnm4r\" (UniqueName: \"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r\") pod \"dnsmasq-dns-669d466995-chhdd\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:54 crc kubenswrapper[4990]: I1003 11:06:54.621983 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.077185 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:06:55 crc kubenswrapper[4990]: W1003 11:06:55.080458 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77da3289_2b52_4bad_aed7_f4509c5ecc6e.slice/crio-07aa6b22bcf191280296fa0602f05f810f51f18ca350fb14a97ec00acd0461cb WatchSource:0}: Error finding container 07aa6b22bcf191280296fa0602f05f810f51f18ca350fb14a97ec00acd0461cb: Status 404 returned error can't find the container with id 07aa6b22bcf191280296fa0602f05f810f51f18ca350fb14a97ec00acd0461cb Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.123088 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.304084 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.304140 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.304191 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.304758 4990 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.304809 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" gracePeriod=600 Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.372203 4990 generic.go:334] "Generic (PLEG): container finished" podID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerID="6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9" exitCode=0 Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.372244 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669d466995-chhdd" event={"ID":"77da3289-2b52-4bad-aed7-f4509c5ecc6e","Type":"ContainerDied","Data":"6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9"} Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.372282 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669d466995-chhdd" event={"ID":"77da3289-2b52-4bad-aed7-f4509c5ecc6e","Type":"ContainerStarted","Data":"07aa6b22bcf191280296fa0602f05f810f51f18ca350fb14a97ec00acd0461cb"} Oct 03 11:06:55 crc kubenswrapper[4990]: E1003 11:06:55.434608 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:06:55 crc kubenswrapper[4990]: I1003 11:06:55.732162 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.381184 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669d466995-chhdd" event={"ID":"77da3289-2b52-4bad-aed7-f4509c5ecc6e","Type":"ContainerStarted","Data":"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75"} Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.381398 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.383727 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" exitCode=0 Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.383762 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"} Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.383789 4990 scope.go:117] "RemoveContainer" containerID="1e900d340694debff290495e97d4f0b3946a050657962c4ba637a78dc937b8c1" Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.384067 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:06:56 crc kubenswrapper[4990]: E1003 11:06:56.384251 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:06:56 crc kubenswrapper[4990]: I1003 11:06:56.407035 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-669d466995-chhdd" podStartSLOduration=2.407013377 podStartE2EDuration="2.407013377s" podCreationTimestamp="2025-10-03 11:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:06:56.406011951 +0000 UTC m=+4998.202643808" watchObservedRunningTime="2025-10-03 11:06:56.407013377 +0000 UTC m=+4998.203645234" Oct 03 11:06:59 crc kubenswrapper[4990]: I1003 11:06:59.672632 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="rabbitmq" containerID="cri-o://93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e" gracePeriod=604796 Oct 03 11:06:59 crc kubenswrapper[4990]: I1003 11:06:59.792624 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="rabbitmq" containerID="cri-o://cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07" gracePeriod=604796 Oct 03 11:07:04 crc kubenswrapper[4990]: I1003 11:07:04.624797 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:07:04 crc kubenswrapper[4990]: I1003 11:07:04.731872 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:07:04 crc kubenswrapper[4990]: I1003 11:07:04.732558 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5b77d44889-psgnq" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="dnsmasq-dns" containerID="cri-o://8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a" gracePeriod=10 Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.135012 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.219428 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc\") pod \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.219556 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config\") pod \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.219617 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pzw\" (UniqueName: \"kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw\") pod \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\" (UID: \"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7\") " Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.234781 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw" (OuterVolumeSpecName: "kube-api-access-n5pzw") pod "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" (UID: "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7"). InnerVolumeSpecName "kube-api-access-n5pzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.266247 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config" (OuterVolumeSpecName: "config") pod "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" (UID: "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.298091 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" (UID: "e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.320743 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.320787 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5pzw\" (UniqueName: \"kubernetes.io/projected/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-kube-api-access-n5pzw\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.320804 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.467243 4990 generic.go:334] "Generic (PLEG): container finished" podID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerID="8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a" exitCode=0 Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 
11:07:05.467307 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" event={"ID":"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7","Type":"ContainerDied","Data":"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a"} Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.467350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" event={"ID":"e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7","Type":"ContainerDied","Data":"1742ff18a33f12e7bc1d0fe7cc2b78e5be0da70bc34cebacca1822c59c34edcc"} Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.467383 4990 scope.go:117] "RemoveContainer" containerID="8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.467383 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b77d44889-psgnq" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.499908 4990 scope.go:117] "RemoveContainer" containerID="580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.530794 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.541915 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b77d44889-psgnq"] Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.561424 4990 scope.go:117] "RemoveContainer" containerID="8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a" Oct 03 11:07:05 crc kubenswrapper[4990]: E1003 11:07:05.562700 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a\": container with ID starting with 8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a not found: 
ID does not exist" containerID="8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.562767 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a"} err="failed to get container status \"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a\": rpc error: code = NotFound desc = could not find container \"8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a\": container with ID starting with 8f60634939b2702e7a2107e30efd25649503e2178b7778929247341cf509cc7a not found: ID does not exist" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.562807 4990 scope.go:117] "RemoveContainer" containerID="580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9" Oct 03 11:07:05 crc kubenswrapper[4990]: E1003 11:07:05.563664 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9\": container with ID starting with 580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9 not found: ID does not exist" containerID="580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9" Oct 03 11:07:05 crc kubenswrapper[4990]: I1003 11:07:05.563732 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9"} err="failed to get container status \"580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9\": rpc error: code = NotFound desc = could not find container \"580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9\": container with ID starting with 580baf1774156b375a3386c143dea6b4c482c8445dc8cd07b799a1449b6375d9 not found: ID does not exist" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.314816 4990 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.388909 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.443992 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444403 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444444 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444502 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444573 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf\") pod 
\"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444610 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444690 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444752 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444859 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444910 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444940 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.444974 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445005 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445041 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445072 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445153 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 
11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445183 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2qck\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445361 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445409 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445438 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptwfh\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh\") pod \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\" (UID: \"72eecd3c-f6f7-436c-af04-ef2126ea0c8b\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445626 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.445679 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.447497 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.448203 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.448446 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.448961 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.449006 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.451164 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh" (OuterVolumeSpecName: "kube-api-access-ptwfh") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "kube-api-access-ptwfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.451634 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info" (OuterVolumeSpecName: "pod-info") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.452265 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.452877 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.454111 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.454186 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.455671 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck" (OuterVolumeSpecName: "kube-api-access-m2qck") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "kube-api-access-m2qck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.455937 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.461677 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info" (OuterVolumeSpecName: "pod-info") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.479017 4990 generic.go:334] "Generic (PLEG): container finished" podID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerID="93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e" exitCode=0 Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.479160 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.479697 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerDied","Data":"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e"} Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.479749 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72eecd3c-f6f7-436c-af04-ef2126ea0c8b","Type":"ContainerDied","Data":"1ab4d9a5c7f36ef860bcc29377af6b03eeb7b379389dcef8f77139f32ce19e2f"} Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.480049 4990 scope.go:117] "RemoveContainer" containerID="93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.485919 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98 podName:3afeb5ad-0968-47ee-af7f-fc8506997433 nodeName:}" failed. No retries permitted until 2025-10-03 11:07:06.985887619 +0000 UTC m=+5008.782519476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "persistence" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.486037 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.485934 4990 generic.go:334] "Generic (PLEG): container finished" podID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerID="cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07" exitCode=0 Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.486146 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14" (OuterVolumeSpecName: "persistence") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "pvc-0eac497a-5476-4492-91a7-b0864a078f14". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.486183 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerDied","Data":"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07"} Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.486405 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3afeb5ad-0968-47ee-af7f-fc8506997433","Type":"ContainerDied","Data":"724fee61fa3828bcd01cf199d6b9734b190886b1f5d15e04eee9472ae875c250"} Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.488218 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data" (OuterVolumeSpecName: "config-data") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.488924 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data" (OuterVolumeSpecName: "config-data") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.510367 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf" (OuterVolumeSpecName: "server-conf") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.512362 4990 scope.go:117] "RemoveContainer" containerID="7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.527145 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf" (OuterVolumeSpecName: "server-conf") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.528888 4990 scope.go:117] "RemoveContainer" containerID="93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.529906 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e\": container with ID starting with 93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e not found: ID does not exist" containerID="93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.529942 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e"} err="failed to get container status \"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e\": rpc error: code = NotFound desc = could not find container \"93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e\": container with ID starting with 93deab4f2c66d56304043418bf85a980231e729e64c26befb83253926785280e not found: ID does not exist" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.529968 4990 scope.go:117] "RemoveContainer" containerID="7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.530400 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4\": container with ID starting with 7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4 not found: ID does not exist" containerID="7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.530432 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4"} err="failed to get container status \"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4\": rpc error: code = NotFound desc = could not find container \"7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4\": container with ID starting with 7af6f3d9356d425cd66be6789d572fd14c5a70ef31261981476cb45b1822bbd4 not found: ID does not exist" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.530452 4990 scope.go:117] "RemoveContainer" containerID="cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548669 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548702 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548715 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548727 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548768 4990 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3afeb5ad-0968-47ee-af7f-fc8506997433-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548781 4990 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548792 4990 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3afeb5ad-0968-47ee-af7f-fc8506997433-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548803 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548815 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548854 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2qck\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-kube-api-access-m2qck\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548893 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") on node \"crc\" " Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548950 4990 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548969 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptwfh\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-kube-api-access-ptwfh\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.550737 4990 scope.go:117] "RemoveContainer" containerID="f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.548985 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.552980 4990 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.553310 4990 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.553329 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.553341 4990 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3afeb5ad-0968-47ee-af7f-fc8506997433-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.553355 4990 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.564727 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.567862 4990 scope.go:117] "RemoveContainer" containerID="cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.568341 4990 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.568496 4990 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0eac497a-5476-4492-91a7-b0864a078f14" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14") on node "crc" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.568560 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07\": container with ID starting with cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07 not found: ID does not exist" containerID="cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.568598 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07"} err="failed to get container status \"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07\": rpc error: code = NotFound desc = could not find container \"cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07\": container with ID starting with cc9e8efb3851de1f1ec559d314b0cd2756ba17bfd2d16a5572e1176dbeacdd07 not found: ID does not exist" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.568626 4990 scope.go:117] "RemoveContainer" containerID="f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.568916 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0\": container with ID starting with f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0 not found: ID does not exist" containerID="f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0" 
Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.568938 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0"} err="failed to get container status \"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0\": rpc error: code = NotFound desc = could not find container \"f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0\": container with ID starting with f4af55b812b419d3c5897226ee888b4854aa63dd520ada6fbcf652f708fb72d0 not found: ID does not exist" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.571068 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "72eecd3c-f6f7-436c-af04-ef2126ea0c8b" (UID: "72eecd3c-f6f7-436c-af04-ef2126ea0c8b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.654350 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3afeb5ad-0968-47ee-af7f-fc8506997433-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.654377 4990 reconciler_common.go:293] "Volume detached for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.654389 4990 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72eecd3c-f6f7-436c-af04-ef2126ea0c8b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.820265 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.834429 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.858643 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.858970 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.858992 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.859009 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="init" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859018 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="init" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.859033 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859041 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.859062 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="setup-container" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859070 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="setup-container" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.859088 4990 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="dnsmasq-dns" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859096 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="dnsmasq-dns" Oct 03 11:07:06 crc kubenswrapper[4990]: E1003 11:07:06.859111 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="setup-container" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859120 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="setup-container" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859288 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859313 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" containerName="dnsmasq-dns" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.859326 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" containerName="rabbitmq" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.860344 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.862395 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z4z2f" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.862817 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.862841 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.862880 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.863207 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.863245 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.866976 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.903470 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72eecd3c-f6f7-436c-af04-ef2126ea0c8b" path="/var/lib/kubelet/pods/72eecd3c-f6f7-436c-af04-ef2126ea0c8b/volumes" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.904854 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7" path="/var/lib/kubelet/pods/e95e54cd-eaba-4eaf-adba-08a1e2fc0bc7/volumes" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.959421 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960240 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960282 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxvq\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-kube-api-access-dbxvq\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960312 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960331 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960358 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-config-data\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960484 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960521 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960566 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7cb7a78f-fac0-4afd-872a-edf928061dbd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960596 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7cb7a78f-fac0-4afd-872a-edf928061dbd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:06 crc kubenswrapper[4990]: I1003 11:07:06.960769 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.061802 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"3afeb5ad-0968-47ee-af7f-fc8506997433\" (UID: \"3afeb5ad-0968-47ee-af7f-fc8506997433\") " Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062173 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxvq\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-kube-api-access-dbxvq\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062290 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062326 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062357 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-config-data\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062386 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062430 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7cb7a78f-fac0-4afd-872a-edf928061dbd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062582 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7cb7a78f-fac0-4afd-872a-edf928061dbd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " 
pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.062668 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.063462 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-config-data\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.063972 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.064364 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.064430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.064635 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7cb7a78f-fac0-4afd-872a-edf928061dbd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.066636 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.066761 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d1a63a80764637e6840e48d066d143b6c2316b148a10b0e73149fc27a3797c67/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.068513 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7cb7a78f-fac0-4afd-872a-edf928061dbd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.068844 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7cb7a78f-fac0-4afd-872a-edf928061dbd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.070318 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.080283 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.080680 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98" (OuterVolumeSpecName: "persistence") pod "3afeb5ad-0968-47ee-af7f-fc8506997433" (UID: "3afeb5ad-0968-47ee-af7f-fc8506997433"). InnerVolumeSpecName "pvc-dda4d984-f8ec-442b-b821-034d49a1cd98". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.091779 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxvq\" (UniqueName: \"kubernetes.io/projected/7cb7a78f-fac0-4afd-872a-edf928061dbd-kube-api-access-dbxvq\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.101122 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0eac497a-5476-4492-91a7-b0864a078f14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0eac497a-5476-4492-91a7-b0864a078f14\") pod \"rabbitmq-server-0\" (UID: \"7cb7a78f-fac0-4afd-872a-edf928061dbd\") " pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.166523 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") on node \"crc\" " Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.173762 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.178932 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.182217 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.185813 4990 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.186078 4990 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dda4d984-f8ec-442b-b821-034d49a1cd98" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98") on node "crc" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.204371 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.206809 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.208369 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-v95t6" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.208842 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.209406 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.209655 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.209677 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.210481 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.210796 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.271979 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272039 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00f14153-c741-40c4-8cc9-7535f429d860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272063 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272094 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00f14153-c741-40c4-8cc9-7535f429d860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272127 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272148 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272187 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272277 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bxkn\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-kube-api-access-6bxkn\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272335 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272375 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.272401 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.273794 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.274279 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.274325 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/48c415e463b64b7f7cdde39f2ccb21f532e0762692d5d7249f65ab51659afcd4/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.334030 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dda4d984-f8ec-442b-b821-034d49a1cd98\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.374591 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.374645 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.374667 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.374718 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375129 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375178 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375242 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00f14153-c741-40c4-8cc9-7535f429d860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375270 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375319 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00f14153-c741-40c4-8cc9-7535f429d860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375730 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.375798 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bxkn\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-kube-api-access-6bxkn\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.376115 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.377401 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.378911 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00f14153-c741-40c4-8cc9-7535f429d860-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.380798 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00f14153-c741-40c4-8cc9-7535f429d860-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.380889 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.380901 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.381149 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00f14153-c741-40c4-8cc9-7535f429d860-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.391403 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bxkn\" (UniqueName: \"kubernetes.io/projected/00f14153-c741-40c4-8cc9-7535f429d860-kube-api-access-6bxkn\") pod \"rabbitmq-cell1-server-0\" (UID: \"00f14153-c741-40c4-8cc9-7535f429d860\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.583746 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.652584 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 03 11:07:07 crc kubenswrapper[4990]: W1003 11:07:07.666810 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb7a78f_fac0_4afd_872a_edf928061dbd.slice/crio-6097591ea2dcb8c5f4b87f3e45982d325217efa4b252a6d5e8a2654f8928b056 WatchSource:0}: Error finding container 6097591ea2dcb8c5f4b87f3e45982d325217efa4b252a6d5e8a2654f8928b056: Status 404 returned error can't find the container with id 6097591ea2dcb8c5f4b87f3e45982d325217efa4b252a6d5e8a2654f8928b056
Oct 03 11:07:07 crc kubenswrapper[4990]: I1003 11:07:07.853942 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 03 11:07:07 crc kubenswrapper[4990]: W1003 11:07:07.864076 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f14153_c741_40c4_8cc9_7535f429d860.slice/crio-6c7847177c5adc424e227f79be21e2007f0c38009a7f10cf0b0ea0264f4ed333 WatchSource:0}: Error finding container 6c7847177c5adc424e227f79be21e2007f0c38009a7f10cf0b0ea0264f4ed333: Status 404 returned error can't find the container with id 6c7847177c5adc424e227f79be21e2007f0c38009a7f10cf0b0ea0264f4ed333
Oct 03 11:07:08 crc kubenswrapper[4990]: I1003 11:07:08.503621 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00f14153-c741-40c4-8cc9-7535f429d860","Type":"ContainerStarted","Data":"6c7847177c5adc424e227f79be21e2007f0c38009a7f10cf0b0ea0264f4ed333"}
Oct 03 11:07:08 crc kubenswrapper[4990]: I1003 11:07:08.507556 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7cb7a78f-fac0-4afd-872a-edf928061dbd","Type":"ContainerStarted","Data":"6097591ea2dcb8c5f4b87f3e45982d325217efa4b252a6d5e8a2654f8928b056"}
Oct 03 11:07:08 crc kubenswrapper[4990]: I1003 11:07:08.890879 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afeb5ad-0968-47ee-af7f-fc8506997433" path="/var/lib/kubelet/pods/3afeb5ad-0968-47ee-af7f-fc8506997433/volumes"
Oct 03 11:07:09 crc kubenswrapper[4990]: I1003 11:07:09.520076 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00f14153-c741-40c4-8cc9-7535f429d860","Type":"ContainerStarted","Data":"b6e571ac009fc16411d8b23c9c62f219c049046c319484df995733c4fbbf1fa5"}
Oct 03 11:07:10 crc kubenswrapper[4990]: I1003 11:07:10.543088 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7cb7a78f-fac0-4afd-872a-edf928061dbd","Type":"ContainerStarted","Data":"2acfcdbc02020e828ae247df028bbaf87eaae333da5d9364036e9746c33882ac"}
Oct 03 11:07:10 crc kubenswrapper[4990]: I1003 11:07:10.872480 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"
Oct 03 11:07:10 crc kubenswrapper[4990]: E1003 11:07:10.872893 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:07:23 crc kubenswrapper[4990]: I1003 11:07:23.872076 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"
Oct 03 11:07:23 crc kubenswrapper[4990]: E1003 11:07:23.873394 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:07:36 crc kubenswrapper[4990]: I1003 11:07:36.872417 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"
Oct 03 11:07:36 crc kubenswrapper[4990]: E1003 11:07:36.873259 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:07:42 crc kubenswrapper[4990]: I1003 11:07:42.857682 4990 generic.go:334] "Generic (PLEG): container finished" podID="00f14153-c741-40c4-8cc9-7535f429d860" containerID="b6e571ac009fc16411d8b23c9c62f219c049046c319484df995733c4fbbf1fa5" exitCode=0
Oct 03 11:07:42 crc kubenswrapper[4990]: I1003 11:07:42.858309 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00f14153-c741-40c4-8cc9-7535f429d860","Type":"ContainerDied","Data":"b6e571ac009fc16411d8b23c9c62f219c049046c319484df995733c4fbbf1fa5"}
Oct 03 11:07:43 crc kubenswrapper[4990]: I1003 11:07:43.869021 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00f14153-c741-40c4-8cc9-7535f429d860","Type":"ContainerStarted","Data":"dd09c80b163fb3d2c60cab2af7de9a71753a5ceceeefd06b962f480a22e64f9e"}
Oct 03 11:07:43 crc kubenswrapper[4990]: I1003 11:07:43.869683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:07:43 crc kubenswrapper[4990]: I1003 11:07:43.871216 4990 generic.go:334] "Generic (PLEG): container finished" podID="7cb7a78f-fac0-4afd-872a-edf928061dbd" containerID="2acfcdbc02020e828ae247df028bbaf87eaae333da5d9364036e9746c33882ac" exitCode=0
Oct 03 11:07:43 crc kubenswrapper[4990]: I1003 11:07:43.871260 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7cb7a78f-fac0-4afd-872a-edf928061dbd","Type":"ContainerDied","Data":"2acfcdbc02020e828ae247df028bbaf87eaae333da5d9364036e9746c33882ac"}
Oct 03 11:07:43 crc kubenswrapper[4990]: I1003 11:07:43.927908 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.927883767 podStartE2EDuration="36.927883767s" podCreationTimestamp="2025-10-03 11:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:07:43.908631874 +0000 UTC m=+5045.705263791" watchObservedRunningTime="2025-10-03 11:07:43.927883767 +0000 UTC m=+5045.724515634"
Oct 03 11:07:44 crc kubenswrapper[4990]: I1003 11:07:44.883699 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7cb7a78f-fac0-4afd-872a-edf928061dbd","Type":"ContainerStarted","Data":"0b6bc0f90fdcadd3eb64dd42e63856666a6852d306d40efd4d2438de4cdfa6e9"}
Oct 03 11:07:44 crc kubenswrapper[4990]: I1003 11:07:44.884406 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 03 11:07:44 crc kubenswrapper[4990]: I1003 11:07:44.921733 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.921709545 podStartE2EDuration="38.921709545s" podCreationTimestamp="2025-10-03 11:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:07:44.915790513 +0000 UTC m=+5046.712422410" watchObservedRunningTime="2025-10-03 11:07:44.921709545 +0000 UTC m=+5046.718341412"
Oct 03 11:07:49 crc kubenswrapper[4990]: I1003 11:07:49.872684 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"
Oct 03 11:07:49 crc kubenswrapper[4990]: E1003 11:07:49.873722 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:07:57 crc kubenswrapper[4990]: I1003 11:07:57.186681 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Oct 03 11:07:57 crc kubenswrapper[4990]: I1003 11:07:57.586717 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.475251 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.476395 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.479552 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bfg69"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.488098 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.550639 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44tv4\" (UniqueName: \"kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4\") pod \"mariadb-client-1-default\" (UID: \"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6\") " pod="openstack/mariadb-client-1-default"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.652690 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44tv4\" (UniqueName: \"kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4\") pod \"mariadb-client-1-default\" (UID: \"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6\") " pod="openstack/mariadb-client-1-default"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.687463 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44tv4\" (UniqueName: \"kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4\") pod \"mariadb-client-1-default\" (UID: \"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6\") " pod="openstack/mariadb-client-1-default"
Oct 03 11:08:00 crc kubenswrapper[4990]: I1003 11:08:00.800978 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Oct 03 11:08:01 crc kubenswrapper[4990]: I1003 11:08:01.330349 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 11:08:01 crc kubenswrapper[4990]: W1003 11:08:01.339859 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737bf4f9_970e_4d0f_a6d0_f542abd6c1b6.slice/crio-cfe30aee9b0bd9eaa83ed47c1db8fb2f08713a43ed49de700e7e27b4bbce33f5 WatchSource:0}: Error finding container cfe30aee9b0bd9eaa83ed47c1db8fb2f08713a43ed49de700e7e27b4bbce33f5: Status 404 returned error can't find the container with id cfe30aee9b0bd9eaa83ed47c1db8fb2f08713a43ed49de700e7e27b4bbce33f5
Oct 03 11:08:02 crc kubenswrapper[4990]: I1003 11:08:02.024956 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6","Type":"ContainerStarted","Data":"cfe30aee9b0bd9eaa83ed47c1db8fb2f08713a43ed49de700e7e27b4bbce33f5"}
Oct 03 11:08:02 crc kubenswrapper[4990]: I1003 11:08:02.872388 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7"
Oct 03 11:08:02 crc kubenswrapper[4990]: E1003 11:08:02.872981 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:08:03 crc kubenswrapper[4990]: I1003 11:08:03.033311 4990 generic.go:334] "Generic (PLEG): container finished" podID="737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" containerID="430741ae909064696380996dc42c48191982c1bff98e4624b5dc37060878462d" exitCode=0
Oct 03 11:08:03 crc kubenswrapper[4990]: I1003 11:08:03.033359 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6","Type":"ContainerDied","Data":"430741ae909064696380996dc42c48191982c1bff98e4624b5dc37060878462d"}
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.477681 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.510730 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44tv4\" (UniqueName: \"kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4\") pod \"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6\" (UID: \"737bf4f9-970e-4d0f-a6d0-f542abd6c1b6\") "
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.512351 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_737bf4f9-970e-4d0f-a6d0-f542abd6c1b6/mariadb-client-1-default/0.log"
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.519771 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4" (OuterVolumeSpecName: "kube-api-access-44tv4") pod "737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" (UID: "737bf4f9-970e-4d0f-a6d0-f542abd6c1b6"). InnerVolumeSpecName "kube-api-access-44tv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.542249 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.550226 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"]
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.612742 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44tv4\" (UniqueName: \"kubernetes.io/projected/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6-kube-api-access-44tv4\") on node \"crc\" DevicePath \"\""
Oct 03 11:08:04 crc kubenswrapper[4990]: I1003 11:08:04.881159 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" path="/var/lib/kubelet/pods/737bf4f9-970e-4d0f-a6d0-f542abd6c1b6/volumes"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.053665 4990 scope.go:117] "RemoveContainer" containerID="430741ae909064696380996dc42c48191982c1bff98e4624b5dc37060878462d"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.053745 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.055972 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 11:08:05 crc kubenswrapper[4990]: E1003 11:08:05.056365 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" containerName="mariadb-client-1-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.056387 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" containerName="mariadb-client-1-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.056586 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="737bf4f9-970e-4d0f-a6d0-f542abd6c1b6" containerName="mariadb-client-1-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.057077 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.059280 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bfg69"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.068773 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.120241 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9flr\" (UniqueName: \"kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr\") pod \"mariadb-client-2-default\" (UID: \"19d5e904-0d33-4dd3-8ae4-ba2a2669b581\") " pod="openstack/mariadb-client-2-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.222208 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9flr\" (UniqueName: \"kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr\") pod \"mariadb-client-2-default\" (UID: \"19d5e904-0d33-4dd3-8ae4-ba2a2669b581\") " pod="openstack/mariadb-client-2-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.242221 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9flr\" (UniqueName: \"kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr\") pod \"mariadb-client-2-default\" (UID: \"19d5e904-0d33-4dd3-8ae4-ba2a2669b581\") " pod="openstack/mariadb-client-2-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.426389 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 11:08:05 crc kubenswrapper[4990]: I1003 11:08:05.776683 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 11:08:06 crc kubenswrapper[4990]: I1003 11:08:06.064262 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"19d5e904-0d33-4dd3-8ae4-ba2a2669b581","Type":"ContainerStarted","Data":"46c894583e1a789705fa5b2772c4e1f70ecb5e7956b506b97477e0da7ab38719"}
Oct 03 11:08:06 crc kubenswrapper[4990]: I1003 11:08:06.064328 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"19d5e904-0d33-4dd3-8ae4-ba2a2669b581","Type":"ContainerStarted","Data":"47027279a58b4f5f4ac9e386a8b6b95cb44b089e43fada0f39adaa1c0ab77c11"}
Oct 03 11:08:06 crc kubenswrapper[4990]: I1003 11:08:06.085649 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.085624752 podStartE2EDuration="1.085624752s" podCreationTimestamp="2025-10-03 11:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:08:06.079915326 +0000 UTC m=+5067.876547193" watchObservedRunningTime="2025-10-03 11:08:06.085624752 +0000 UTC m=+5067.882256619"
Oct 03 11:08:07 crc kubenswrapper[4990]: I1003 11:08:07.074749 4990 generic.go:334] "Generic (PLEG): container finished" podID="19d5e904-0d33-4dd3-8ae4-ba2a2669b581" containerID="46c894583e1a789705fa5b2772c4e1f70ecb5e7956b506b97477e0da7ab38719" exitCode=0
Oct 03 11:08:07 crc kubenswrapper[4990]: I1003 11:08:07.074814 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"19d5e904-0d33-4dd3-8ae4-ba2a2669b581","Type":"ContainerDied","Data":"46c894583e1a789705fa5b2772c4e1f70ecb5e7956b506b97477e0da7ab38719"}
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.460346 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.488072 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9flr\" (UniqueName: \"kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr\") pod \"19d5e904-0d33-4dd3-8ae4-ba2a2669b581\" (UID: \"19d5e904-0d33-4dd3-8ae4-ba2a2669b581\") "
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.497911 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr" (OuterVolumeSpecName: "kube-api-access-w9flr") pod "19d5e904-0d33-4dd3-8ae4-ba2a2669b581" (UID: "19d5e904-0d33-4dd3-8ae4-ba2a2669b581"). InnerVolumeSpecName "kube-api-access-w9flr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.504971 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.515914 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"]
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.590319 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9flr\" (UniqueName: \"kubernetes.io/projected/19d5e904-0d33-4dd3-8ae4-ba2a2669b581-kube-api-access-w9flr\") on node \"crc\" DevicePath \"\""
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.886794 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d5e904-0d33-4dd3-8ae4-ba2a2669b581" path="/var/lib/kubelet/pods/19d5e904-0d33-4dd3-8ae4-ba2a2669b581/volumes"
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.996196 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 11:08:08 crc kubenswrapper[4990]: E1003 11:08:08.996726 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d5e904-0d33-4dd3-8ae4-ba2a2669b581" containerName="mariadb-client-2-default"
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.996755 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d5e904-0d33-4dd3-8ae4-ba2a2669b581" containerName="mariadb-client-2-default"
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.997101 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d5e904-0d33-4dd3-8ae4-ba2a2669b581" containerName="mariadb-client-2-default"
Oct 03 11:08:08 crc kubenswrapper[4990]: I1003 11:08:08.997991 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.004083 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.090414 4990 scope.go:117] "RemoveContainer" containerID="46c894583e1a789705fa5b2772c4e1f70ecb5e7956b506b97477e0da7ab38719"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.090459 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.100125 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njv5\" (UniqueName: \"kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5\") pod \"mariadb-client-1\" (UID: \"2177bd63-02e2-4981-948e-5d137156b074\") " pod="openstack/mariadb-client-1"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.201231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njv5\" (UniqueName: \"kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5\") pod \"mariadb-client-1\" (UID: \"2177bd63-02e2-4981-948e-5d137156b074\") " pod="openstack/mariadb-client-1"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.219832 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njv5\" (UniqueName: \"kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5\") pod \"mariadb-client-1\" (UID: \"2177bd63-02e2-4981-948e-5d137156b074\") " pod="openstack/mariadb-client-1"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.327852 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Oct 03 11:08:09 crc kubenswrapper[4990]: I1003 11:08:09.825294 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 11:08:10 crc kubenswrapper[4990]: I1003 11:08:10.100899 4990 generic.go:334] "Generic (PLEG): container finished" podID="2177bd63-02e2-4981-948e-5d137156b074" containerID="f78d79c994dda2476a8980fa680ff8b50a8dc2c46cb238f932c7a586c373d319" exitCode=0
Oct 03 11:08:10 crc kubenswrapper[4990]: I1003 11:08:10.101112 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2177bd63-02e2-4981-948e-5d137156b074","Type":"ContainerDied","Data":"f78d79c994dda2476a8980fa680ff8b50a8dc2c46cb238f932c7a586c373d319"}
Oct 03 11:08:10 crc kubenswrapper[4990]: I1003 11:08:10.101258 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2177bd63-02e2-4981-948e-5d137156b074","Type":"ContainerStarted","Data":"02c99efe7bb6a8e26610e08355752f1364e461ff2364c9db12368205a0655398"}
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.490393 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.511306 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2177bd63-02e2-4981-948e-5d137156b074/mariadb-client-1/0.log"
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.540527 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.545037 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"]
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.546960 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5njv5\" (UniqueName: \"kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5\") pod \"2177bd63-02e2-4981-948e-5d137156b074\" (UID: \"2177bd63-02e2-4981-948e-5d137156b074\") "
Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.554425 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5" (OuterVolumeSpecName: "kube-api-access-5njv5") pod "2177bd63-02e2-4981-948e-5d137156b074" (UID: "2177bd63-02e2-4981-948e-5d137156b074"). InnerVolumeSpecName "kube-api-access-5njv5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:11 crc kubenswrapper[4990]: I1003 11:08:11.648439 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5njv5\" (UniqueName: \"kubernetes.io/projected/2177bd63-02e2-4981-948e-5d137156b074-kube-api-access-5njv5\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.033217 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 11:08:12 crc kubenswrapper[4990]: E1003 11:08:12.034236 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2177bd63-02e2-4981-948e-5d137156b074" containerName="mariadb-client-1" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.034273 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2177bd63-02e2-4981-948e-5d137156b074" containerName="mariadb-client-1" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.034575 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2177bd63-02e2-4981-948e-5d137156b074" containerName="mariadb-client-1" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.035188 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.041274 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.120604 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c99efe7bb6a8e26610e08355752f1364e461ff2364c9db12368205a0655398" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.120681 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.158617 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbgl\" (UniqueName: \"kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl\") pod \"mariadb-client-4-default\" (UID: \"e6b2d158-0904-4506-9743-855132c12850\") " pod="openstack/mariadb-client-4-default" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.260966 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbgl\" (UniqueName: \"kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl\") pod \"mariadb-client-4-default\" (UID: \"e6b2d158-0904-4506-9743-855132c12850\") " pod="openstack/mariadb-client-4-default" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.281297 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbgl\" (UniqueName: \"kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl\") pod \"mariadb-client-4-default\" (UID: \"e6b2d158-0904-4506-9743-855132c12850\") " pod="openstack/mariadb-client-4-default" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.355726 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.673306 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 11:08:12 crc kubenswrapper[4990]: I1003 11:08:12.883241 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2177bd63-02e2-4981-948e-5d137156b074" path="/var/lib/kubelet/pods/2177bd63-02e2-4981-948e-5d137156b074/volumes" Oct 03 11:08:13 crc kubenswrapper[4990]: I1003 11:08:13.131196 4990 generic.go:334] "Generic (PLEG): container finished" podID="e6b2d158-0904-4506-9743-855132c12850" containerID="63bfe6a9399a20d68c8ea5ca97edb95c003f6beba5476ddef979d2642638b3fd" exitCode=0 Oct 03 11:08:13 crc kubenswrapper[4990]: I1003 11:08:13.131406 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"e6b2d158-0904-4506-9743-855132c12850","Type":"ContainerDied","Data":"63bfe6a9399a20d68c8ea5ca97edb95c003f6beba5476ddef979d2642638b3fd"} Oct 03 11:08:13 crc kubenswrapper[4990]: I1003 11:08:13.131491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"e6b2d158-0904-4506-9743-855132c12850","Type":"ContainerStarted","Data":"e00fb556de9b7adace7b8f380563c3d753b6e8e830b30d57881aab3d0d960f56"} Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.549973 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.573773 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_e6b2d158-0904-4506-9743-855132c12850/mariadb-client-4-default/0.log" Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.616113 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.623173 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.700926 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqbgl\" (UniqueName: \"kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl\") pod \"e6b2d158-0904-4506-9743-855132c12850\" (UID: \"e6b2d158-0904-4506-9743-855132c12850\") " Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.706598 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl" (OuterVolumeSpecName: "kube-api-access-vqbgl") pod "e6b2d158-0904-4506-9743-855132c12850" (UID: "e6b2d158-0904-4506-9743-855132c12850"). InnerVolumeSpecName "kube-api-access-vqbgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.803364 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqbgl\" (UniqueName: \"kubernetes.io/projected/e6b2d158-0904-4506-9743-855132c12850-kube-api-access-vqbgl\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.872067 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:08:14 crc kubenswrapper[4990]: E1003 11:08:14.872361 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:08:14 crc kubenswrapper[4990]: I1003 11:08:14.887084 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b2d158-0904-4506-9743-855132c12850" path="/var/lib/kubelet/pods/e6b2d158-0904-4506-9743-855132c12850/volumes" Oct 03 11:08:15 crc kubenswrapper[4990]: I1003 11:08:15.149208 4990 scope.go:117] "RemoveContainer" containerID="63bfe6a9399a20d68c8ea5ca97edb95c003f6beba5476ddef979d2642638b3fd" Oct 03 11:08:15 crc kubenswrapper[4990]: I1003 11:08:15.149243 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.789577 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 11:08:18 crc kubenswrapper[4990]: E1003 11:08:18.790915 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b2d158-0904-4506-9743-855132c12850" containerName="mariadb-client-4-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.790950 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b2d158-0904-4506-9743-855132c12850" containerName="mariadb-client-4-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.791326 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b2d158-0904-4506-9743-855132c12850" containerName="mariadb-client-4-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.792676 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.797129 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bfg69" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.808844 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.874917 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ssql\" (UniqueName: \"kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql\") pod \"mariadb-client-5-default\" (UID: \"c5f8c8cf-21ed-4efc-8481-57ffe936bd73\") " pod="openstack/mariadb-client-5-default" Oct 03 11:08:18 crc kubenswrapper[4990]: I1003 11:08:18.976583 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ssql\" (UniqueName: 
\"kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql\") pod \"mariadb-client-5-default\" (UID: \"c5f8c8cf-21ed-4efc-8481-57ffe936bd73\") " pod="openstack/mariadb-client-5-default" Oct 03 11:08:19 crc kubenswrapper[4990]: I1003 11:08:19.004629 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ssql\" (UniqueName: \"kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql\") pod \"mariadb-client-5-default\" (UID: \"c5f8c8cf-21ed-4efc-8481-57ffe936bd73\") " pod="openstack/mariadb-client-5-default" Oct 03 11:08:19 crc kubenswrapper[4990]: I1003 11:08:19.125896 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 11:08:19 crc kubenswrapper[4990]: I1003 11:08:19.445560 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 11:08:20 crc kubenswrapper[4990]: I1003 11:08:20.202128 4990 generic.go:334] "Generic (PLEG): container finished" podID="c5f8c8cf-21ed-4efc-8481-57ffe936bd73" containerID="d362c750436ca42753dff65b5c57c1595909ad8e2a00546631fd1dc7abf93161" exitCode=0 Oct 03 11:08:20 crc kubenswrapper[4990]: I1003 11:08:20.202399 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c5f8c8cf-21ed-4efc-8481-57ffe936bd73","Type":"ContainerDied","Data":"d362c750436ca42753dff65b5c57c1595909ad8e2a00546631fd1dc7abf93161"} Oct 03 11:08:20 crc kubenswrapper[4990]: I1003 11:08:20.202636 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c5f8c8cf-21ed-4efc-8481-57ffe936bd73","Type":"ContainerStarted","Data":"1a678ef42cb509a375a849604d9d52aba1a3584d3a365701471a54f92d2f73cb"} Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.698647 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.718538 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_c5f8c8cf-21ed-4efc-8481-57ffe936bd73/mariadb-client-5-default/0.log" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.742866 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.752606 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.826908 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ssql\" (UniqueName: \"kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql\") pod \"c5f8c8cf-21ed-4efc-8481-57ffe936bd73\" (UID: \"c5f8c8cf-21ed-4efc-8481-57ffe936bd73\") " Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.835738 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql" (OuterVolumeSpecName: "kube-api-access-6ssql") pod "c5f8c8cf-21ed-4efc-8481-57ffe936bd73" (UID: "c5f8c8cf-21ed-4efc-8481-57ffe936bd73"). InnerVolumeSpecName "kube-api-access-6ssql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.929418 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ssql\" (UniqueName: \"kubernetes.io/projected/c5f8c8cf-21ed-4efc-8481-57ffe936bd73-kube-api-access-6ssql\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.933324 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 11:08:21 crc kubenswrapper[4990]: E1003 11:08:21.933761 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f8c8cf-21ed-4efc-8481-57ffe936bd73" containerName="mariadb-client-5-default" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.933787 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f8c8cf-21ed-4efc-8481-57ffe936bd73" containerName="mariadb-client-5-default" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.933985 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f8c8cf-21ed-4efc-8481-57ffe936bd73" containerName="mariadb-client-5-default" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.934814 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 11:08:21 crc kubenswrapper[4990]: I1003 11:08:21.948830 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.031659 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr765\" (UniqueName: \"kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765\") pod \"mariadb-client-6-default\" (UID: \"d17c4ea8-6680-421f-ade2-15a31e47fcf9\") " pod="openstack/mariadb-client-6-default" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.133436 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr765\" (UniqueName: \"kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765\") pod \"mariadb-client-6-default\" (UID: \"d17c4ea8-6680-421f-ade2-15a31e47fcf9\") " pod="openstack/mariadb-client-6-default" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.156136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr765\" (UniqueName: \"kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765\") pod \"mariadb-client-6-default\" (UID: \"d17c4ea8-6680-421f-ade2-15a31e47fcf9\") " pod="openstack/mariadb-client-6-default" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.220380 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a678ef42cb509a375a849604d9d52aba1a3584d3a365701471a54f92d2f73cb" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.220452 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.258270 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.860071 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 11:08:22 crc kubenswrapper[4990]: I1003 11:08:22.890930 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f8c8cf-21ed-4efc-8481-57ffe936bd73" path="/var/lib/kubelet/pods/c5f8c8cf-21ed-4efc-8481-57ffe936bd73/volumes" Oct 03 11:08:23 crc kubenswrapper[4990]: I1003 11:08:23.228664 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d17c4ea8-6680-421f-ade2-15a31e47fcf9","Type":"ContainerStarted","Data":"8ec4a268ed3978f68bf390d5b10546e128d801c4fe6383a17f5e33a8f41d7640"} Oct 03 11:08:23 crc kubenswrapper[4990]: I1003 11:08:23.228712 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d17c4ea8-6680-421f-ade2-15a31e47fcf9","Type":"ContainerStarted","Data":"62f99ffbcc462f0815cfd3ce6e984616ded051435045ef16eee25679ec2ed10f"} Oct 03 11:08:23 crc kubenswrapper[4990]: I1003 11:08:23.243221 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.243193842 podStartE2EDuration="2.243193842s" podCreationTimestamp="2025-10-03 11:08:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:08:23.238737758 +0000 UTC m=+5085.035369615" watchObservedRunningTime="2025-10-03 11:08:23.243193842 +0000 UTC m=+5085.039825699" Oct 03 11:08:24 crc kubenswrapper[4990]: I1003 11:08:24.242081 4990 generic.go:334] "Generic (PLEG): container finished" podID="d17c4ea8-6680-421f-ade2-15a31e47fcf9" containerID="8ec4a268ed3978f68bf390d5b10546e128d801c4fe6383a17f5e33a8f41d7640" exitCode=0 Oct 03 11:08:24 crc kubenswrapper[4990]: I1003 11:08:24.242144 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"d17c4ea8-6680-421f-ade2-15a31e47fcf9","Type":"ContainerDied","Data":"8ec4a268ed3978f68bf390d5b10546e128d801c4fe6383a17f5e33a8f41d7640"} Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.699857 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.740744 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.746564 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.793118 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr765\" (UniqueName: \"kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765\") pod \"d17c4ea8-6680-421f-ade2-15a31e47fcf9\" (UID: \"d17c4ea8-6680-421f-ade2-15a31e47fcf9\") " Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.799995 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765" (OuterVolumeSpecName: "kube-api-access-lr765") pod "d17c4ea8-6680-421f-ade2-15a31e47fcf9" (UID: "d17c4ea8-6680-421f-ade2-15a31e47fcf9"). InnerVolumeSpecName "kube-api-access-lr765". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.871918 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:08:25 crc kubenswrapper[4990]: E1003 11:08:25.872402 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.897103 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr765\" (UniqueName: \"kubernetes.io/projected/d17c4ea8-6680-421f-ade2-15a31e47fcf9-kube-api-access-lr765\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.908948 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 11:08:25 crc kubenswrapper[4990]: E1003 11:08:25.909371 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17c4ea8-6680-421f-ade2-15a31e47fcf9" containerName="mariadb-client-6-default" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.909396 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17c4ea8-6680-421f-ade2-15a31e47fcf9" containerName="mariadb-client-6-default" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.909641 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17c4ea8-6680-421f-ade2-15a31e47fcf9" containerName="mariadb-client-6-default" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.910263 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.921372 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 11:08:25 crc kubenswrapper[4990]: I1003 11:08:25.998719 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm99g\" (UniqueName: \"kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g\") pod \"mariadb-client-7-default\" (UID: \"da24b3b1-5c00-4af4-bca0-523be8980a02\") " pod="openstack/mariadb-client-7-default" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.100830 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm99g\" (UniqueName: \"kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g\") pod \"mariadb-client-7-default\" (UID: \"da24b3b1-5c00-4af4-bca0-523be8980a02\") " pod="openstack/mariadb-client-7-default" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.124428 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm99g\" (UniqueName: \"kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g\") pod \"mariadb-client-7-default\" (UID: \"da24b3b1-5c00-4af4-bca0-523be8980a02\") " pod="openstack/mariadb-client-7-default" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.238042 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.260356 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f99ffbcc462f0815cfd3ce6e984616ded051435045ef16eee25679ec2ed10f" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.260443 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.772495 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 11:08:26 crc kubenswrapper[4990]: I1003 11:08:26.884898 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17c4ea8-6680-421f-ade2-15a31e47fcf9" path="/var/lib/kubelet/pods/d17c4ea8-6680-421f-ade2-15a31e47fcf9/volumes" Oct 03 11:08:27 crc kubenswrapper[4990]: I1003 11:08:27.270792 4990 generic.go:334] "Generic (PLEG): container finished" podID="da24b3b1-5c00-4af4-bca0-523be8980a02" containerID="f6419d08b55d770887476b8aadb152f970666397f1c6ab82279420c9454d1658" exitCode=0 Oct 03 11:08:27 crc kubenswrapper[4990]: I1003 11:08:27.270858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"da24b3b1-5c00-4af4-bca0-523be8980a02","Type":"ContainerDied","Data":"f6419d08b55d770887476b8aadb152f970666397f1c6ab82279420c9454d1658"} Oct 03 11:08:27 crc kubenswrapper[4990]: I1003 11:08:27.271117 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"da24b3b1-5c00-4af4-bca0-523be8980a02","Type":"ContainerStarted","Data":"f64aebb281ed5f97534b2d8112d9e3277b279c835632b362624370074ab9d040"} Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.740926 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.765824 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_da24b3b1-5c00-4af4-bca0-523be8980a02/mariadb-client-7-default/0.log" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.799680 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.809680 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.846146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm99g\" (UniqueName: \"kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g\") pod \"da24b3b1-5c00-4af4-bca0-523be8980a02\" (UID: \"da24b3b1-5c00-4af4-bca0-523be8980a02\") " Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.857344 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g" (OuterVolumeSpecName: "kube-api-access-tm99g") pod "da24b3b1-5c00-4af4-bca0-523be8980a02" (UID: "da24b3b1-5c00-4af4-bca0-523be8980a02"). InnerVolumeSpecName "kube-api-access-tm99g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.892313 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da24b3b1-5c00-4af4-bca0-523be8980a02" path="/var/lib/kubelet/pods/da24b3b1-5c00-4af4-bca0-523be8980a02/volumes" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.941726 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 03 11:08:28 crc kubenswrapper[4990]: E1003 11:08:28.942639 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da24b3b1-5c00-4af4-bca0-523be8980a02" containerName="mariadb-client-7-default" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.942659 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="da24b3b1-5c00-4af4-bca0-523be8980a02" containerName="mariadb-client-7-default" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.943089 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="da24b3b1-5c00-4af4-bca0-523be8980a02" containerName="mariadb-client-7-default" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.944249 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.952781 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm99g\" (UniqueName: \"kubernetes.io/projected/da24b3b1-5c00-4af4-bca0-523be8980a02-kube-api-access-tm99g\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:28 crc kubenswrapper[4990]: I1003 11:08:28.954702 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.054630 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgh54\" (UniqueName: \"kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54\") pod \"mariadb-client-2\" (UID: \"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2\") " pod="openstack/mariadb-client-2" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.157192 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgh54\" (UniqueName: \"kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54\") pod \"mariadb-client-2\" (UID: \"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2\") " pod="openstack/mariadb-client-2" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.178247 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgh54\" (UniqueName: \"kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54\") pod \"mariadb-client-2\" (UID: \"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2\") " pod="openstack/mariadb-client-2" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.286829 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.295003 4990 scope.go:117] "RemoveContainer" containerID="f6419d08b55d770887476b8aadb152f970666397f1c6ab82279420c9454d1658" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.295151 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 11:08:29 crc kubenswrapper[4990]: I1003 11:08:29.675784 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 11:08:29 crc kubenswrapper[4990]: W1003 11:08:29.687799 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd5659e_bbb5_4e1c_af5f_dc186cf2fde2.slice/crio-589e6323508ec2b2541e00d4097af3a1f86ff2a687d36565607306b95721f927 WatchSource:0}: Error finding container 589e6323508ec2b2541e00d4097af3a1f86ff2a687d36565607306b95721f927: Status 404 returned error can't find the container with id 589e6323508ec2b2541e00d4097af3a1f86ff2a687d36565607306b95721f927 Oct 03 11:08:30 crc kubenswrapper[4990]: I1003 11:08:30.306915 4990 generic.go:334] "Generic (PLEG): container finished" podID="dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" containerID="0588746fd1b3d51705868bf75254477c3bb78e258f976d1a27de508c9663f03b" exitCode=0 Oct 03 11:08:30 crc kubenswrapper[4990]: I1003 11:08:30.307049 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2","Type":"ContainerDied","Data":"0588746fd1b3d51705868bf75254477c3bb78e258f976d1a27de508c9663f03b"} Oct 03 11:08:30 crc kubenswrapper[4990]: I1003 11:08:30.308635 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2","Type":"ContainerStarted","Data":"589e6323508ec2b2541e00d4097af3a1f86ff2a687d36565607306b95721f927"} Oct 03 11:08:31 crc 
kubenswrapper[4990]: I1003 11:08:31.725892 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.746406 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2/mariadb-client-2/0.log" Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.778118 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.785256 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.805632 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgh54\" (UniqueName: \"kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54\") pod \"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2\" (UID: \"dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2\") " Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.812436 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54" (OuterVolumeSpecName: "kube-api-access-dgh54") pod "dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" (UID: "dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2"). InnerVolumeSpecName "kube-api-access-dgh54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:08:31 crc kubenswrapper[4990]: I1003 11:08:31.907208 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgh54\" (UniqueName: \"kubernetes.io/projected/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2-kube-api-access-dgh54\") on node \"crc\" DevicePath \"\"" Oct 03 11:08:32 crc kubenswrapper[4990]: I1003 11:08:32.325874 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589e6323508ec2b2541e00d4097af3a1f86ff2a687d36565607306b95721f927" Oct 03 11:08:32 crc kubenswrapper[4990]: I1003 11:08:32.325949 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 11:08:32 crc kubenswrapper[4990]: I1003 11:08:32.878874 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" path="/var/lib/kubelet/pods/dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2/volumes" Oct 03 11:08:38 crc kubenswrapper[4990]: I1003 11:08:38.882552 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:08:38 crc kubenswrapper[4990]: E1003 11:08:38.883406 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:08:53 crc kubenswrapper[4990]: I1003 11:08:53.871729 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:08:53 crc kubenswrapper[4990]: E1003 11:08:53.872596 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:09:07 crc kubenswrapper[4990]: I1003 11:09:07.873393 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:09:07 crc kubenswrapper[4990]: E1003 11:09:07.874621 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:09:22 crc kubenswrapper[4990]: I1003 11:09:22.872439 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:09:22 crc kubenswrapper[4990]: E1003 11:09:22.874167 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:09:36 crc kubenswrapper[4990]: I1003 11:09:36.872203 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:09:36 crc kubenswrapper[4990]: E1003 11:09:36.873622 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:09:51 crc kubenswrapper[4990]: I1003 11:09:51.873215 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:09:51 crc kubenswrapper[4990]: E1003 11:09:51.874186 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:10:06 crc kubenswrapper[4990]: I1003 11:10:06.872070 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:10:06 crc kubenswrapper[4990]: E1003 11:10:06.873211 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:10:18 crc kubenswrapper[4990]: I1003 11:10:18.881403 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:10:18 crc kubenswrapper[4990]: E1003 11:10:18.883295 4990 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:10:32 crc kubenswrapper[4990]: I1003 11:10:32.872287 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:10:32 crc kubenswrapper[4990]: E1003 11:10:32.872934 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:10:43 crc kubenswrapper[4990]: I1003 11:10:43.872426 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:10:43 crc kubenswrapper[4990]: E1003 11:10:43.873612 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:10:47 crc kubenswrapper[4990]: I1003 11:10:47.867262 4990 scope.go:117] "RemoveContainer" containerID="77390e486cd11435c49886bcbdb10706a8d04cc11b67d5c425785e23b06ede9e" Oct 03 11:10:58 crc kubenswrapper[4990]: I1003 11:10:58.872125 4990 scope.go:117] 
"RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:10:58 crc kubenswrapper[4990]: E1003 11:10:58.873841 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:11:11 crc kubenswrapper[4990]: I1003 11:11:11.872711 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:11:11 crc kubenswrapper[4990]: E1003 11:11:11.873730 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:11:24 crc kubenswrapper[4990]: I1003 11:11:24.872195 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:11:24 crc kubenswrapper[4990]: E1003 11:11:24.873090 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:11:37 crc kubenswrapper[4990]: I1003 11:11:37.873500 
4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:11:37 crc kubenswrapper[4990]: E1003 11:11:37.874693 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:11:52 crc kubenswrapper[4990]: I1003 11:11:52.871306 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:11:52 crc kubenswrapper[4990]: E1003 11:11:52.872076 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.178096 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 11:11:56 crc kubenswrapper[4990]: E1003 11:11:56.179047 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" containerName="mariadb-client-2" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.179073 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" containerName="mariadb-client-2" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.179368 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dfd5659e-bbb5-4e1c-af5f-dc186cf2fde2" containerName="mariadb-client-2" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.180187 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.186065 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bfg69" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.193743 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.247366 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.247566 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rctf\" (UniqueName: \"kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.348149 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.348245 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rctf\" (UniqueName: 
\"kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.352319 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.352362 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7bd9c0732947513729ddc9b514c034bfe867c0cd459d3ebb877f73c9623c4272/globalmount\"" pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.368428 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rctf\" (UniqueName: \"kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.388204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") pod \"mariadb-copy-data\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " pod="openstack/mariadb-copy-data" Oct 03 11:11:56 crc kubenswrapper[4990]: I1003 11:11:56.538259 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 11:11:57 crc kubenswrapper[4990]: I1003 11:11:57.104719 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 11:11:57 crc kubenswrapper[4990]: I1003 11:11:57.296083 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0b9a6e33-94e4-4df7-9d5d-dcadfc621424","Type":"ContainerStarted","Data":"808b2abfb8a8b141d0378e7c4cd776d8745c9df1135258a70843b81abdf00514"} Oct 03 11:11:58 crc kubenswrapper[4990]: I1003 11:11:58.305556 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0b9a6e33-94e4-4df7-9d5d-dcadfc621424","Type":"ContainerStarted","Data":"eaac1eb10a5eb2bae783302519adf0ef16f9b747b48e02bac1a6672913d2f3c8"} Oct 03 11:11:58 crc kubenswrapper[4990]: I1003 11:11:58.331780 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.331756789 podStartE2EDuration="3.331756789s" podCreationTimestamp="2025-10-03 11:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:11:58.326459915 +0000 UTC m=+5300.123091812" watchObservedRunningTime="2025-10-03 11:11:58.331756789 +0000 UTC m=+5300.128388656" Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.622239 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.624214 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.630399 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.725842 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcm22\" (UniqueName: \"kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22\") pod \"mariadb-client\" (UID: \"4b547c92-16ed-4064-b878-e28f58446d6a\") " pod="openstack/mariadb-client" Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.826324 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcm22\" (UniqueName: \"kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22\") pod \"mariadb-client\" (UID: \"4b547c92-16ed-4064-b878-e28f58446d6a\") " pod="openstack/mariadb-client" Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.850812 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcm22\" (UniqueName: \"kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22\") pod \"mariadb-client\" (UID: \"4b547c92-16ed-4064-b878-e28f58446d6a\") " pod="openstack/mariadb-client" Oct 03 11:12:00 crc kubenswrapper[4990]: I1003 11:12:00.963839 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:01 crc kubenswrapper[4990]: I1003 11:12:01.388639 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:01 crc kubenswrapper[4990]: W1003 11:12:01.397138 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b547c92_16ed_4064_b878_e28f58446d6a.slice/crio-6bb3d147b9e48545eed07636c77d7a8733228fe03479cbc65bffabc8d2c1950d WatchSource:0}: Error finding container 6bb3d147b9e48545eed07636c77d7a8733228fe03479cbc65bffabc8d2c1950d: Status 404 returned error can't find the container with id 6bb3d147b9e48545eed07636c77d7a8733228fe03479cbc65bffabc8d2c1950d Oct 03 11:12:02 crc kubenswrapper[4990]: I1003 11:12:02.340836 4990 generic.go:334] "Generic (PLEG): container finished" podID="4b547c92-16ed-4064-b878-e28f58446d6a" containerID="c4e2610ba2eb582e91d1cbad6ef217d3f68c47d112529a1e2c78a8e225042006" exitCode=0 Oct 03 11:12:02 crc kubenswrapper[4990]: I1003 11:12:02.340900 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4b547c92-16ed-4064-b878-e28f58446d6a","Type":"ContainerDied","Data":"c4e2610ba2eb582e91d1cbad6ef217d3f68c47d112529a1e2c78a8e225042006"} Oct 03 11:12:02 crc kubenswrapper[4990]: I1003 11:12:02.340936 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4b547c92-16ed-4064-b878-e28f58446d6a","Type":"ContainerStarted","Data":"6bb3d147b9e48545eed07636c77d7a8733228fe03479cbc65bffabc8d2c1950d"} Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.667482 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.672614 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcm22\" (UniqueName: \"kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22\") pod \"4b547c92-16ed-4064-b878-e28f58446d6a\" (UID: \"4b547c92-16ed-4064-b878-e28f58446d6a\") " Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.681202 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22" (OuterVolumeSpecName: "kube-api-access-wcm22") pod "4b547c92-16ed-4064-b878-e28f58446d6a" (UID: "4b547c92-16ed-4064-b878-e28f58446d6a"). InnerVolumeSpecName "kube-api-access-wcm22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.688008 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4b547c92-16ed-4064-b878-e28f58446d6a/mariadb-client/0.log" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.716849 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.722554 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.773731 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcm22\" (UniqueName: \"kubernetes.io/projected/4b547c92-16ed-4064-b878-e28f58446d6a-kube-api-access-wcm22\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.851025 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:03 crc kubenswrapper[4990]: E1003 11:12:03.851614 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b547c92-16ed-4064-b878-e28f58446d6a" 
containerName="mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.851646 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b547c92-16ed-4064-b878-e28f58446d6a" containerName="mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.851851 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b547c92-16ed-4064-b878-e28f58446d6a" containerName="mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.852633 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.860842 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.875114 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf5rt\" (UniqueName: \"kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt\") pod \"mariadb-client\" (UID: \"1124a412-a34d-4676-8f5d-7291e4844ef6\") " pod="openstack/mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.977889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf5rt\" (UniqueName: \"kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt\") pod \"mariadb-client\" (UID: \"1124a412-a34d-4676-8f5d-7291e4844ef6\") " pod="openstack/mariadb-client" Oct 03 11:12:03 crc kubenswrapper[4990]: I1003 11:12:03.994686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf5rt\" (UniqueName: \"kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt\") pod \"mariadb-client\" (UID: \"1124a412-a34d-4676-8f5d-7291e4844ef6\") " pod="openstack/mariadb-client" Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.188954 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.361839 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb3d147b9e48545eed07636c77d7a8733228fe03479cbc65bffabc8d2c1950d" Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.361914 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.378496 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="4b547c92-16ed-4064-b878-e28f58446d6a" podUID="1124a412-a34d-4676-8f5d-7291e4844ef6" Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.433638 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:04 crc kubenswrapper[4990]: W1003 11:12:04.441596 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1124a412_a34d_4676_8f5d_7291e4844ef6.slice/crio-e3405e0283d4b8b2ad86690bf0731c1624f1b71552a06e145ad7119d4bdc8cc6 WatchSource:0}: Error finding container e3405e0283d4b8b2ad86690bf0731c1624f1b71552a06e145ad7119d4bdc8cc6: Status 404 returned error can't find the container with id e3405e0283d4b8b2ad86690bf0731c1624f1b71552a06e145ad7119d4bdc8cc6 Oct 03 11:12:04 crc kubenswrapper[4990]: I1003 11:12:04.886947 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b547c92-16ed-4064-b878-e28f58446d6a" path="/var/lib/kubelet/pods/4b547c92-16ed-4064-b878-e28f58446d6a/volumes" Oct 03 11:12:05 crc kubenswrapper[4990]: I1003 11:12:05.370539 4990 generic.go:334] "Generic (PLEG): container finished" podID="1124a412-a34d-4676-8f5d-7291e4844ef6" containerID="69789bd9497f4ef7adc5eacd9af53d530516a2fb0d3a3ccdf566bf51125673c4" exitCode=0 Oct 03 11:12:05 crc kubenswrapper[4990]: I1003 11:12:05.370590 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1124a412-a34d-4676-8f5d-7291e4844ef6","Type":"ContainerDied","Data":"69789bd9497f4ef7adc5eacd9af53d530516a2fb0d3a3ccdf566bf51125673c4"} Oct 03 11:12:05 crc kubenswrapper[4990]: I1003 11:12:05.370650 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1124a412-a34d-4676-8f5d-7291e4844ef6","Type":"ContainerStarted","Data":"e3405e0283d4b8b2ad86690bf0731c1624f1b71552a06e145ad7119d4bdc8cc6"} Oct 03 11:12:05 crc kubenswrapper[4990]: I1003 11:12:05.872427 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.378404 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb"} Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.679555 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.694735 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_1124a412-a34d-4676-8f5d-7291e4844ef6/mariadb-client/0.log" Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.724507 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.731074 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.825920 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf5rt\" (UniqueName: \"kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt\") pod \"1124a412-a34d-4676-8f5d-7291e4844ef6\" (UID: \"1124a412-a34d-4676-8f5d-7291e4844ef6\") " Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.844694 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt" (OuterVolumeSpecName: "kube-api-access-kf5rt") pod "1124a412-a34d-4676-8f5d-7291e4844ef6" (UID: "1124a412-a34d-4676-8f5d-7291e4844ef6"). InnerVolumeSpecName "kube-api-access-kf5rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.880621 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1124a412-a34d-4676-8f5d-7291e4844ef6" path="/var/lib/kubelet/pods/1124a412-a34d-4676-8f5d-7291e4844ef6/volumes" Oct 03 11:12:06 crc kubenswrapper[4990]: I1003 11:12:06.927691 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf5rt\" (UniqueName: \"kubernetes.io/projected/1124a412-a34d-4676-8f5d-7291e4844ef6-kube-api-access-kf5rt\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:07 crc kubenswrapper[4990]: I1003 11:12:07.390436 4990 scope.go:117] "RemoveContainer" containerID="69789bd9497f4ef7adc5eacd9af53d530516a2fb0d3a3ccdf566bf51125673c4" Oct 03 11:12:07 crc kubenswrapper[4990]: I1003 11:12:07.390564 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.933852 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 11:12:42 crc kubenswrapper[4990]: E1003 11:12:42.934942 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1124a412-a34d-4676-8f5d-7291e4844ef6" containerName="mariadb-client" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.934965 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1124a412-a34d-4676-8f5d-7291e4844ef6" containerName="mariadb-client" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.935254 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1124a412-a34d-4676-8f5d-7291e4844ef6" containerName="mariadb-client" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.936695 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.938563 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.938883 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.939291 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.939448 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bdpg9" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.939658 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.953757 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.978238 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.980910 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.988087 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 11:12:42 crc kubenswrapper[4990]: I1003 11:12:42.996887 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.008931 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.022045 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051790 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdkz\" (UniqueName: \"kubernetes.io/projected/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-kube-api-access-9wdkz\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051858 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051907 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051941 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051973 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.051994 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.052044 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-648391c1-0205-4f8e-ada0-58484bb0353a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-648391c1-0205-4f8e-ada0-58484bb0353a\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.052079 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-config\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.119052 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.121419 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.129288 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.130810 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.132839 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4vwqv" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.132851 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.134072 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.134398 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.140654 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153111 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153149 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 
crc kubenswrapper[4990]: I1003 11:12:43.153166 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-config\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153186 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153202 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153222 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153240 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153259 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153282 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-config\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153329 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153347 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-648391c1-0205-4f8e-ada0-58484bb0353a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-648391c1-0205-4f8e-ada0-58484bb0353a\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153364 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hwnzx\" (UniqueName: \"kubernetes.io/projected/91033b00-fa38-456f-b530-86e9a3570cd4-kube-api-access-hwnzx\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153385 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153403 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-config\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153425 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153454 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153474 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153491 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153531 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdkz\" (UniqueName: \"kubernetes.io/projected/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-kube-api-access-9wdkz\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153546 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckpb\" (UniqueName: \"kubernetes.io/projected/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-kube-api-access-bckpb\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153569 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153589 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153602 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.153658 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.154608 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.155006 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-config\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.155430 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.155928 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.159898 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.159941 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-648391c1-0205-4f8e-ada0-58484bb0353a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-648391c1-0205-4f8e-ada0-58484bb0353a\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/187af39f98315a8e45e1e450035f6317f8b6426d4137e51e89fff20dfdf750db/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.163910 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.169827 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.173813 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.174772 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.184702 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdkz\" (UniqueName: \"kubernetes.io/projected/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-kube-api-access-9wdkz\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.186875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b02ea1a2-ca33-4f2c-8279-d7b210f06c02-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.223558 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-648391c1-0205-4f8e-ada0-58484bb0353a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-648391c1-0205-4f8e-ada0-58484bb0353a\") pod \"ovsdbserver-nb-0\" (UID: \"b02ea1a2-ca33-4f2c-8279-d7b210f06c02\") " pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255031 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-config\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255191 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255259 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255309 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255358 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255399 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.255791 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-config\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256100 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256168 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-config\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256202 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256339 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-670b14d3-c321-4d97-980d-2c35843a14c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670b14d3-c321-4d97-980d-2c35843a14c8\") pod 
\"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256374 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-config\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256398 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256444 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-config\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256545 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256578 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " 
pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256648 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnzx\" (UniqueName: \"kubernetes.io/projected/91033b00-fa38-456f-b530-86e9a3570cd4-kube-api-access-hwnzx\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256671 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-config\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256736 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256787 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256822 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256926 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256978 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbbt\" (UniqueName: \"kubernetes.io/projected/6311fec4-d49f-4a01-a145-8e9afc3dded6-kube-api-access-spbbt\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.256990 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257022 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257075 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkts9\" (UniqueName: \"kubernetes.io/projected/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-kube-api-access-jkts9\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257166 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257218 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257265 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257391 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.257733 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-config\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258159 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrzn\" (UniqueName: 
\"kubernetes.io/projected/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-kube-api-access-pmrzn\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258267 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258338 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91033b00-fa38-456f-b530-86e9a3570cd4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258346 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258396 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258402 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-scripts\") pod \"ovsdbserver-nb-2\" (UID: 
\"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258427 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bckpb\" (UniqueName: \"kubernetes.io/projected/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-kube-api-access-bckpb\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258460 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258481 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258548 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258569 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: 
I1003 11:12:43.258587 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258616 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.258985 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.260527 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.260565 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66d4167bff633fc484e2fec38d4c8eaf05c84044500bd42278ac8e5744b396a7/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.261349 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.261705 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.262183 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.262376 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.264225 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.264556 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/911269ef1c9603af08b000e90352fffc919653f752f5ba38e004bdd9e933a999/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.265029 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91033b00-fa38-456f-b530-86e9a3570cd4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.276743 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckpb\" (UniqueName: \"kubernetes.io/projected/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-kube-api-access-bckpb\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.278412 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.286183 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ba951f-b8cb-45d4-8c4d-2d43a008877b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.293131 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnzx\" (UniqueName: \"kubernetes.io/projected/91033b00-fa38-456f-b530-86e9a3570cd4-kube-api-access-hwnzx\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.306318 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cc80937-d6dc-4a57-9aa4-2ae42b57ed1c\") pod \"ovsdbserver-nb-1\" (UID: \"91033b00-fa38-456f-b530-86e9a3570cd4\") " pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.311686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13b0a35f-c02e-4bfb-9740-3df58a816756\") pod \"ovsdbserver-nb-2\" (UID: \"a6ba951f-b8cb-45d4-8c4d-2d43a008877b\") " pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.319329 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.360937 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361253 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361299 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361325 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361375 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361412 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361483 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361548 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-config\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361577 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-670b14d3-c321-4d97-980d-2c35843a14c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670b14d3-c321-4d97-980d-2c35843a14c8\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361619 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-config\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361641 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361698 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361724 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-config\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361744 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361791 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: 
\"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361817 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361864 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbbt\" (UniqueName: \"kubernetes.io/projected/6311fec4-d49f-4a01-a145-8e9afc3dded6-kube-api-access-spbbt\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361895 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkts9\" (UniqueName: \"kubernetes.io/projected/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-kube-api-access-jkts9\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361931 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.361992 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.362017 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.362052 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrzn\" (UniqueName: \"kubernetes.io/projected/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-kube-api-access-pmrzn\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.362078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.362534 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-config\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.362818 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.363557 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-config\") pod \"ovsdbserver-sb-2\" (UID: 
\"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.365035 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.365655 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.366457 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.367619 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.367650 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-config\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.368042 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6311fec4-d49f-4a01-a145-8e9afc3dded6-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.369252 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.369289 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/572b63168df2727551382e4f52131a8f45760fda5acd9fb0f14c4459259d526c/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.372890 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.372918 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44a65defaab22c9427cacb3d6c07d162025c8c6d70c6e9951ee5acd92ae33ba5/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.373203 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.374297 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.376186 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.376794 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: 
\"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.377361 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.377820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.378360 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.378354 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6311fec4-d49f-4a01-a145-8e9afc3dded6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.379865 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.380050 4990 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.380088 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-670b14d3-c321-4d97-980d-2c35843a14c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670b14d3-c321-4d97-980d-2c35843a14c8\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/000563501134d21442ee0d92b4e0160decc7010e0094a3be5314e6d68bae9d9b/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.387430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkts9\" (UniqueName: \"kubernetes.io/projected/82cfebd8-f54d-4ea6-8c90-8f125ab5f21d-kube-api-access-jkts9\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.387468 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrzn\" (UniqueName: \"kubernetes.io/projected/5bcb4406-33eb-4a5a-8e65-2ab0267ed97b-kube-api-access-pmrzn\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.389884 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbbt\" (UniqueName: \"kubernetes.io/projected/6311fec4-d49f-4a01-a145-8e9afc3dded6-kube-api-access-spbbt\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.422801 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-487cc6b8-5fd5-4e50-9acd-aa50355ca6c0\") pod \"ovsdbserver-sb-1\" (UID: \"6311fec4-d49f-4a01-a145-8e9afc3dded6\") " pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.431240 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-670b14d3-c321-4d97-980d-2c35843a14c8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-670b14d3-c321-4d97-980d-2c35843a14c8\") pod \"ovsdbserver-sb-0\" (UID: \"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b\") " pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.433585 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9e9fcc79-1e48-4c29-bcc8-2541c8a11099\") pod \"ovsdbserver-sb-2\" (UID: \"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d\") " pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.446721 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.465032 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.536983 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.601915 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.796010 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 11:12:43 crc kubenswrapper[4990]: I1003 11:12:43.912106 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.005069 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 11:12:44 crc kubenswrapper[4990]: W1003 11:12:44.007805 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82cfebd8_f54d_4ea6_8c90_8f125ab5f21d.slice/crio-c497b954dd08c5b17cec7f246bc326199b6dd10f803707ab27fcfb71dddcecfd WatchSource:0}: Error finding container c497b954dd08c5b17cec7f246bc326199b6dd10f803707ab27fcfb71dddcecfd: Status 404 returned error can't find the container with id c497b954dd08c5b17cec7f246bc326199b6dd10f803707ab27fcfb71dddcecfd Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.108776 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.198740 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.721079 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6311fec4-d49f-4a01-a145-8e9afc3dded6","Type":"ContainerStarted","Data":"ce6c16846021d086e50f941b2e0f5847ff6f669f0131a191bfc2c91a2650eebe"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.721129 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6311fec4-d49f-4a01-a145-8e9afc3dded6","Type":"ContainerStarted","Data":"a13ec6daf53ff7ea7d11873720ea413b53976d63a1ce0dc4387a79550e954abd"} Oct 03 11:12:44 crc 
kubenswrapper[4990]: I1003 11:12:44.721160 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6311fec4-d49f-4a01-a145-8e9afc3dded6","Type":"ContainerStarted","Data":"59ca593bb5a90ae0820f00b5dc595328851eaea1c6657abcac6142a16242c449"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.722879 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b02ea1a2-ca33-4f2c-8279-d7b210f06c02","Type":"ContainerStarted","Data":"e702ddecf8c293caf74f643ada01d22177101a2e5c45a20c796926ada6850cf6"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.722938 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b02ea1a2-ca33-4f2c-8279-d7b210f06c02","Type":"ContainerStarted","Data":"84fdf5f674f49abf5171a0af36ce9998e535bc534dbbc4626ec3eb0be49fa9f8"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.722955 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b02ea1a2-ca33-4f2c-8279-d7b210f06c02","Type":"ContainerStarted","Data":"db48547a8a28e2374d034fdf342b254694606679982e50491ed25b92909c5644"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.724367 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"91033b00-fa38-456f-b530-86e9a3570cd4","Type":"ContainerStarted","Data":"2c3d6f2caea6120cb4faee259284898900c866427467c727738a63c3dcece860"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.724405 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"91033b00-fa38-456f-b530-86e9a3570cd4","Type":"ContainerStarted","Data":"a9d6edd0b49d5d1fe958111dc4fa2f1753d7150ee7c6a77e5b414cd0581b2eb1"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.724421 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"91033b00-fa38-456f-b530-86e9a3570cd4","Type":"ContainerStarted","Data":"e68eb35a13ff19ebe4474edb5300cfa858df1c73fb48a96972a43aab98887dd6"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.726067 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a6ba951f-b8cb-45d4-8c4d-2d43a008877b","Type":"ContainerStarted","Data":"2588877fccf92842e860734ac2329e98f4ed90e23d5d6ed0198e26d61d2e9be0"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.726106 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a6ba951f-b8cb-45d4-8c4d-2d43a008877b","Type":"ContainerStarted","Data":"b31245ce6e696052877a5b250d6e2e8fca26ea14f82bdf93e7d0c3579c5fef2c"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.726119 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"a6ba951f-b8cb-45d4-8c4d-2d43a008877b","Type":"ContainerStarted","Data":"74593ca20d34e797c7476c5ccce459dbc44b27259966d28a43f4a54947a5a53a"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.727792 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d","Type":"ContainerStarted","Data":"43049ca1f2d6ecba11b46bca639b40f610f9d5949224a9e59c19dafc13605830"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.727825 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d","Type":"ContainerStarted","Data":"01da4ca91742985303c83904508aa97eb810f690b592aeef3dbbf3c9d7271f31"} Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.727838 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"82cfebd8-f54d-4ea6-8c90-8f125ab5f21d","Type":"ContainerStarted","Data":"c497b954dd08c5b17cec7f246bc326199b6dd10f803707ab27fcfb71dddcecfd"} Oct 03 11:12:44 crc 
kubenswrapper[4990]: I1003 11:12:44.751696 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=2.751676891 podStartE2EDuration="2.751676891s" podCreationTimestamp="2025-10-03 11:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:44.741904933 +0000 UTC m=+5346.538536800" watchObservedRunningTime="2025-10-03 11:12:44.751676891 +0000 UTC m=+5346.548308748" Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.763802 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.763778717 podStartE2EDuration="3.763778717s" podCreationTimestamp="2025-10-03 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:44.756999566 +0000 UTC m=+5346.553631433" watchObservedRunningTime="2025-10-03 11:12:44.763778717 +0000 UTC m=+5346.560410574" Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.777033 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.777010753 podStartE2EDuration="3.777010753s" podCreationTimestamp="2025-10-03 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:44.774624242 +0000 UTC m=+5346.571256099" watchObservedRunningTime="2025-10-03 11:12:44.777010753 +0000 UTC m=+5346.573642610" Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.805834 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.805809993 podStartE2EDuration="3.805809993s" podCreationTimestamp="2025-10-03 11:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:44.80057373 +0000 UTC m=+5346.597205607" watchObservedRunningTime="2025-10-03 11:12:44.805809993 +0000 UTC m=+5346.602441860" Oct 03 11:12:44 crc kubenswrapper[4990]: I1003 11:12:44.823715 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=2.823697926 podStartE2EDuration="2.823697926s" podCreationTimestamp="2025-10-03 11:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:44.821131111 +0000 UTC m=+5346.617762988" watchObservedRunningTime="2025-10-03 11:12:44.823697926 +0000 UTC m=+5346.620329773" Oct 03 11:12:45 crc kubenswrapper[4990]: I1003 11:12:45.033052 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 11:12:45 crc kubenswrapper[4990]: W1003 11:12:45.038166 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bcb4406_33eb_4a5a_8e65_2ab0267ed97b.slice/crio-04b81460cda9f309620af5bffe73a3368d7aca30bb5c3750823dea32362da416 WatchSource:0}: Error finding container 04b81460cda9f309620af5bffe73a3368d7aca30bb5c3750823dea32362da416: Status 404 returned error can't find the container with id 04b81460cda9f309620af5bffe73a3368d7aca30bb5c3750823dea32362da416 Oct 03 11:12:45 crc kubenswrapper[4990]: I1003 11:12:45.738354 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b","Type":"ContainerStarted","Data":"50fea9fdcef4d577cf4503c5490155ec1fb0aada1b700d0636e69b2373b966a5"} Oct 03 11:12:45 crc kubenswrapper[4990]: I1003 11:12:45.738393 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b","Type":"ContainerStarted","Data":"01e54314e47f7cf7fab5b5206d4a90444d7d47a89aa5964d7323fad5e0bfa2a2"} Oct 03 11:12:45 crc kubenswrapper[4990]: I1003 11:12:45.738402 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5bcb4406-33eb-4a5a-8e65-2ab0267ed97b","Type":"ContainerStarted","Data":"04b81460cda9f309620af5bffe73a3368d7aca30bb5c3750823dea32362da416"} Oct 03 11:12:45 crc kubenswrapper[4990]: I1003 11:12:45.770120 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.770100765 podStartE2EDuration="3.770100765s" podCreationTimestamp="2025-10-03 11:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:45.765073798 +0000 UTC m=+5347.561705655" watchObservedRunningTime="2025-10-03 11:12:45.770100765 +0000 UTC m=+5347.566732622" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.262750 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.319856 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.323576 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.447673 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.466060 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.538035 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.603254 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:46 crc kubenswrapper[4990]: I1003 11:12:46.746877 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.319995 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.339273 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.447066 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.465563 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.538126 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.602959 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.654198 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.663192 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.665695 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.666815 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.768669 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.768716 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.768746 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.768793 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9jhs\" (UniqueName: \"kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " 
pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.869933 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9jhs\" (UniqueName: \"kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.870104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.870175 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.870212 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.871034 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc 
kubenswrapper[4990]: I1003 11:12:48.872060 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.872722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.901278 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9jhs\" (UniqueName: \"kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs\") pod \"dnsmasq-dns-5cb8648879-q2qjd\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:48 crc kubenswrapper[4990]: I1003 11:12:48.984597 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.355371 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.400081 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.486289 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.510641 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.556072 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.570140 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.603964 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.653601 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.657076 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.760738 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.781454 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" 
event={"ID":"15daae7e-fe8a-4e65-b975-67f1e9ba09f4","Type":"ContainerStarted","Data":"cc9dbb403c324666eef056f3f52702c37aee41c97319d6c32ca9a762cec047f9"} Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.880330 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.911420 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"] Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.912810 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.914870 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.924041 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"] Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.989214 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.989297 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.989351 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.989394 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:49 crc kubenswrapper[4990]: I1003 11:12:49.989420 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvcn\" (UniqueName: \"kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.091069 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.091128 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.091180 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.091219 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.091237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvcn\" (UniqueName: \"kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.092168 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.092167 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.092559 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb\") pod 
\"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.092901 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.113906 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvcn\" (UniqueName: \"kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn\") pod \"dnsmasq-dns-fff56db7f-bqr9r\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") " pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.274365 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.708752 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"] Oct 03 11:12:50 crc kubenswrapper[4990]: W1003 11:12:50.715082 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda093e3ba_92f1_4c4d_b235_6409528c32c3.slice/crio-97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a WatchSource:0}: Error finding container 97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a: Status 404 returned error can't find the container with id 97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.791765 4990 generic.go:334] "Generic (PLEG): container finished" podID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" 
containerID="e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5" exitCode=0 Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.791983 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" event={"ID":"15daae7e-fe8a-4e65-b975-67f1e9ba09f4","Type":"ContainerDied","Data":"e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5"} Oct 03 11:12:50 crc kubenswrapper[4990]: I1003 11:12:50.795369 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" event={"ID":"a093e3ba-92f1-4c4d-b235-6409528c32c3","Type":"ContainerStarted","Data":"97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a"} Oct 03 11:12:51 crc kubenswrapper[4990]: I1003 11:12:51.810503 4990 generic.go:334] "Generic (PLEG): container finished" podID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerID="57f1fd2d131c7c55fa3da8fa6fd8c8784a7f06974761dc67aabc251389e14739" exitCode=0 Oct 03 11:12:51 crc kubenswrapper[4990]: I1003 11:12:51.810570 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" event={"ID":"a093e3ba-92f1-4c4d-b235-6409528c32c3","Type":"ContainerDied","Data":"57f1fd2d131c7c55fa3da8fa6fd8c8784a7f06974761dc67aabc251389e14739"} Oct 03 11:12:51 crc kubenswrapper[4990]: I1003 11:12:51.815480 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" event={"ID":"15daae7e-fe8a-4e65-b975-67f1e9ba09f4","Type":"ContainerStarted","Data":"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899"} Oct 03 11:12:51 crc kubenswrapper[4990]: I1003 11:12:51.815682 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerName="dnsmasq-dns" containerID="cri-o://6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899" gracePeriod=10 Oct 03 11:12:51 crc kubenswrapper[4990]: 
I1003 11:12:51.815764 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:51 crc kubenswrapper[4990]: I1003 11:12:51.878027 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" podStartSLOduration=3.878001673 podStartE2EDuration="3.878001673s" podCreationTimestamp="2025-10-03 11:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:51.857810961 +0000 UTC m=+5353.654442858" watchObservedRunningTime="2025-10-03 11:12:51.878001673 +0000 UTC m=+5353.674633540" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.257803 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.331961 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9jhs\" (UniqueName: \"kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs\") pod \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.332131 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc\") pod \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.332153 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config\") pod \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 
11:12:52.332203 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb\") pod \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\" (UID: \"15daae7e-fe8a-4e65-b975-67f1e9ba09f4\") " Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.348753 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs" (OuterVolumeSpecName: "kube-api-access-g9jhs") pod "15daae7e-fe8a-4e65-b975-67f1e9ba09f4" (UID: "15daae7e-fe8a-4e65-b975-67f1e9ba09f4"). InnerVolumeSpecName "kube-api-access-g9jhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.369783 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15daae7e-fe8a-4e65-b975-67f1e9ba09f4" (UID: "15daae7e-fe8a-4e65-b975-67f1e9ba09f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.370572 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config" (OuterVolumeSpecName: "config") pod "15daae7e-fe8a-4e65-b975-67f1e9ba09f4" (UID: "15daae7e-fe8a-4e65-b975-67f1e9ba09f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.371670 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15daae7e-fe8a-4e65-b975-67f1e9ba09f4" (UID: "15daae7e-fe8a-4e65-b975-67f1e9ba09f4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.434184 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9jhs\" (UniqueName: \"kubernetes.io/projected/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-kube-api-access-g9jhs\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.434223 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.434235 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.434248 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15daae7e-fe8a-4e65-b975-67f1e9ba09f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.824421 4990 generic.go:334] "Generic (PLEG): container finished" podID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerID="6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899" exitCode=0 Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.824486 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" event={"ID":"15daae7e-fe8a-4e65-b975-67f1e9ba09f4","Type":"ContainerDied","Data":"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899"} Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.824497 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.824561 4990 scope.go:117] "RemoveContainer" containerID="6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.824549 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb8648879-q2qjd" event={"ID":"15daae7e-fe8a-4e65-b975-67f1e9ba09f4","Type":"ContainerDied","Data":"cc9dbb403c324666eef056f3f52702c37aee41c97319d6c32ca9a762cec047f9"} Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.840614 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" event={"ID":"a093e3ba-92f1-4c4d-b235-6409528c32c3","Type":"ContainerStarted","Data":"26774592c48626448f932d8de38e128d2946d821756d409380b14325e928bb90"} Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.840787 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.851440 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.859048 4990 scope.go:117] "RemoveContainer" containerID="e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.864254 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb8648879-q2qjd"] Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.873010 4990 scope.go:117] "RemoveContainer" containerID="6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899" Oct 03 11:12:52 crc kubenswrapper[4990]: E1003 11:12:52.873549 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899\": 
container with ID starting with 6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899 not found: ID does not exist" containerID="6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.873588 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899"} err="failed to get container status \"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899\": rpc error: code = NotFound desc = could not find container \"6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899\": container with ID starting with 6ded3b6b989b6b899692445c3e9b76bbd3e39a5d4bf4d4b2e46932029e54a899 not found: ID does not exist" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.873615 4990 scope.go:117] "RemoveContainer" containerID="e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5" Oct 03 11:12:52 crc kubenswrapper[4990]: E1003 11:12:52.873898 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5\": container with ID starting with e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5 not found: ID does not exist" containerID="e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.873928 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5"} err="failed to get container status \"e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5\": rpc error: code = NotFound desc = could not find container \"e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5\": container with ID starting with 
e6630a3320d62b0f562ef78e3fee0bea62ce634d9230c16e400518fe7a7b3ff5 not found: ID does not exist" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.876917 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" podStartSLOduration=3.8768946619999998 podStartE2EDuration="3.876894662s" podCreationTimestamp="2025-10-03 11:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:12:52.873443935 +0000 UTC m=+5354.670075812" watchObservedRunningTime="2025-10-03 11:12:52.876894662 +0000 UTC m=+5354.673526519" Oct 03 11:12:52 crc kubenswrapper[4990]: I1003 11:12:52.880620 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" path="/var/lib/kubelet/pods/15daae7e-fe8a-4e65-b975-67f1e9ba09f4/volumes" Oct 03 11:12:53 crc kubenswrapper[4990]: I1003 11:12:53.500841 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.099093 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 03 11:12:56 crc kubenswrapper[4990]: E1003 11:12:56.099765 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerName="init" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.099780 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerName="init" Oct 03 11:12:56 crc kubenswrapper[4990]: E1003 11:12:56.099825 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerName="dnsmasq-dns" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.099832 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" 
containerName="dnsmasq-dns" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.100020 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="15daae7e-fe8a-4e65-b975-67f1e9ba09f4" containerName="dnsmasq-dns" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.100765 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.106923 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.121457 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.194238 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.194335 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.194451 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvs7f\" (UniqueName: \"kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.296157 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rvs7f\" (UniqueName: \"kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.296327 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.296383 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.303892 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.303934 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7acdd3bda7c97b24be439676dba0ab80d483ee6ee655e39800b2be6f504dacd0/globalmount\"" pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.310939 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.324688 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvs7f\" (UniqueName: \"kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.338238 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") pod \"ovn-copy-data\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " pod="openstack/ovn-copy-data" Oct 03 11:12:56 crc kubenswrapper[4990]: I1003 11:12:56.471786 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 11:12:57 crc kubenswrapper[4990]: I1003 11:12:57.066443 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 11:12:57 crc kubenswrapper[4990]: I1003 11:12:57.067919 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:12:57 crc kubenswrapper[4990]: I1003 11:12:57.886914 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5241c94f-789d-4e82-84d1-8765ee56934a","Type":"ContainerStarted","Data":"7449e52444e458abc0f544390d150c62f721fb6e1e4e75cb66786effe49ad66a"} Oct 03 11:12:57 crc kubenswrapper[4990]: I1003 11:12:57.887282 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5241c94f-789d-4e82-84d1-8765ee56934a","Type":"ContainerStarted","Data":"00ba4025e9b1253112eb96c83b0fa1ba53902f2c32474c44697dc9845c0febb9"} Oct 03 11:12:57 crc kubenswrapper[4990]: I1003 11:12:57.907168 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.3378266659999998 podStartE2EDuration="2.907145736s" podCreationTimestamp="2025-10-03 11:12:55 +0000 UTC" firstStartedPulling="2025-10-03 11:12:57.067291308 +0000 UTC m=+5358.863923205" lastFinishedPulling="2025-10-03 11:12:57.636610418 +0000 UTC m=+5359.433242275" observedRunningTime="2025-10-03 11:12:57.901989555 +0000 UTC m=+5359.698621432" watchObservedRunningTime="2025-10-03 11:12:57.907145736 +0000 UTC m=+5359.703777603" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.275879 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.374709 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.375472 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-669d466995-chhdd" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerName="dnsmasq-dns" containerID="cri-o://438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75" gracePeriod=10 Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.876429 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.917026 4990 generic.go:334] "Generic (PLEG): container finished" podID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerID="438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75" exitCode=0 Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.917073 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669d466995-chhdd" event={"ID":"77da3289-2b52-4bad-aed7-f4509c5ecc6e","Type":"ContainerDied","Data":"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75"} Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.917106 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669d466995-chhdd" event={"ID":"77da3289-2b52-4bad-aed7-f4509c5ecc6e","Type":"ContainerDied","Data":"07aa6b22bcf191280296fa0602f05f810f51f18ca350fb14a97ec00acd0461cb"} Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.917128 4990 scope.go:117] "RemoveContainer" containerID="438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.917076 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669d466995-chhdd" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.938804 4990 scope.go:117] "RemoveContainer" containerID="6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.955833 4990 scope.go:117] "RemoveContainer" containerID="438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75" Oct 03 11:13:00 crc kubenswrapper[4990]: E1003 11:13:00.956380 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75\": container with ID starting with 438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75 not found: ID does not exist" containerID="438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.956430 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75"} err="failed to get container status \"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75\": rpc error: code = NotFound desc = could not find container \"438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75\": container with ID starting with 438ff56dea65a211329934adb1fdeb4351f9e3c90b100a41f07c310831032e75 not found: ID does not exist" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.956462 4990 scope.go:117] "RemoveContainer" containerID="6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9" Oct 03 11:13:00 crc kubenswrapper[4990]: E1003 11:13:00.956823 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9\": container with ID starting with 
6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9 not found: ID does not exist" containerID="6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.956859 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9"} err="failed to get container status \"6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9\": rpc error: code = NotFound desc = could not find container \"6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9\": container with ID starting with 6d6f138053d336b7c9eb246f277d12d487f6e8382ea0b96a2fd635d5512b40a9 not found: ID does not exist" Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.981435 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config\") pod \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.981568 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnm4r\" (UniqueName: \"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r\") pod \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.981704 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc\") pod \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\" (UID: \"77da3289-2b52-4bad-aed7-f4509c5ecc6e\") " Oct 03 11:13:00 crc kubenswrapper[4990]: I1003 11:13:00.987352 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r" (OuterVolumeSpecName: "kube-api-access-rnm4r") pod "77da3289-2b52-4bad-aed7-f4509c5ecc6e" (UID: "77da3289-2b52-4bad-aed7-f4509c5ecc6e"). InnerVolumeSpecName "kube-api-access-rnm4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.024134 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77da3289-2b52-4bad-aed7-f4509c5ecc6e" (UID: "77da3289-2b52-4bad-aed7-f4509c5ecc6e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.040947 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config" (OuterVolumeSpecName: "config") pod "77da3289-2b52-4bad-aed7-f4509c5ecc6e" (UID: "77da3289-2b52-4bad-aed7-f4509c5ecc6e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.083431 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.083472 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnm4r\" (UniqueName: \"kubernetes.io/projected/77da3289-2b52-4bad-aed7-f4509c5ecc6e-kube-api-access-rnm4r\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.083485 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77da3289-2b52-4bad-aed7-f4509c5ecc6e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.251923 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:13:01 crc kubenswrapper[4990]: I1003 11:13:01.260649 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-669d466995-chhdd"] Oct 03 11:13:02 crc kubenswrapper[4990]: E1003 11:13:02.526566 4990 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:56300->38.102.83.146:37319: write tcp 38.102.83.146:56300->38.102.83.146:37319: write: broken pipe Oct 03 11:13:02 crc kubenswrapper[4990]: I1003 11:13:02.884745 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" path="/var/lib/kubelet/pods/77da3289-2b52-4bad-aed7-f4509c5ecc6e/volumes" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.709404 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 11:13:03 crc kubenswrapper[4990]: E1003 11:13:03.711351 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" 
containerName="init" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.711457 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerName="init" Oct 03 11:13:03 crc kubenswrapper[4990]: E1003 11:13:03.711556 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerName="dnsmasq-dns" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.711647 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerName="dnsmasq-dns" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.711928 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="77da3289-2b52-4bad-aed7-f4509c5ecc6e" containerName="dnsmasq-dns" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.713172 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.715121 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4qbjf" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.715465 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.715662 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.718013 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.731064 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.857640 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhkr\" (UniqueName: 
\"kubernetes.io/projected/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-kube-api-access-8bhkr\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.857929 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.857956 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.857983 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-scripts\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.858060 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-config\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.858077 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.858144 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.959296 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.959351 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhkr\" (UniqueName: \"kubernetes.io/projected/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-kube-api-access-8bhkr\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.959721 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.959747 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.960457 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-scripts\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.960531 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-config\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.960550 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.961321 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-scripts\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.961659 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.961869 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-config\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc 
kubenswrapper[4990]: I1003 11:13:03.967121 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.967137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.967773 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:03 crc kubenswrapper[4990]: I1003 11:13:03.976618 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhkr\" (UniqueName: \"kubernetes.io/projected/ebc9ec16-e9c6-4e49-895e-9ff733ff125d-kube-api-access-8bhkr\") pod \"ovn-northd-0\" (UID: \"ebc9ec16-e9c6-4e49-895e-9ff733ff125d\") " pod="openstack/ovn-northd-0" Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.038588 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.565327 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 11:13:04 crc kubenswrapper[4990]: W1003 11:13:04.570931 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebc9ec16_e9c6_4e49_895e_9ff733ff125d.slice/crio-9c5e8a9a504befaa0c7157f4b0a0b1f111399b41da2a5573d2d50e30f89bf910 WatchSource:0}: Error finding container 9c5e8a9a504befaa0c7157f4b0a0b1f111399b41da2a5573d2d50e30f89bf910: Status 404 returned error can't find the container with id 9c5e8a9a504befaa0c7157f4b0a0b1f111399b41da2a5573d2d50e30f89bf910 Oct 03 11:13:04 crc kubenswrapper[4990]: E1003 11:13:04.700688 4990 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.146:56330->38.102.83.146:37319: write tcp 38.102.83.146:56330->38.102.83.146:37319: write: broken pipe Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.952811 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebc9ec16-e9c6-4e49-895e-9ff733ff125d","Type":"ContainerStarted","Data":"46e71c805b3f637d4095d2a2692ad3ec6934b07aa05e2fd73c67f36aa2b99054"} Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.952871 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebc9ec16-e9c6-4e49-895e-9ff733ff125d","Type":"ContainerStarted","Data":"fd779791d46ed2eba82dd04ccc8c15364cf14b00e4fd2df09d7011dab19f14f0"} Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.952887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebc9ec16-e9c6-4e49-895e-9ff733ff125d","Type":"ContainerStarted","Data":"9c5e8a9a504befaa0c7157f4b0a0b1f111399b41da2a5573d2d50e30f89bf910"} Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.953008 4990 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 11:13:04 crc kubenswrapper[4990]: I1003 11:13:04.984093 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.984068027 podStartE2EDuration="1.984068027s" podCreationTimestamp="2025-10-03 11:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:04.971876998 +0000 UTC m=+5366.768508875" watchObservedRunningTime="2025-10-03 11:13:04.984068027 +0000 UTC m=+5366.780699894" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.134632 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s6sbq"] Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.136397 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.144933 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s6sbq"] Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.256026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78djh\" (UniqueName: \"kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh\") pod \"keystone-db-create-s6sbq\" (UID: \"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf\") " pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.357233 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78djh\" (UniqueName: \"kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh\") pod \"keystone-db-create-s6sbq\" (UID: \"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf\") " pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.375729 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78djh\" (UniqueName: \"kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh\") pod \"keystone-db-create-s6sbq\" (UID: \"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf\") " pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.499590 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.971124 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s6sbq"] Oct 03 11:13:09 crc kubenswrapper[4990]: I1003 11:13:09.997634 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6sbq" event={"ID":"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf","Type":"ContainerStarted","Data":"75c4f30bae62d5fa6ef734ac3f913525918ca5b79ad023e07d08b1281d9b0f16"} Oct 03 11:13:11 crc kubenswrapper[4990]: I1003 11:13:11.009945 4990 generic.go:334] "Generic (PLEG): container finished" podID="b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" containerID="deb3141b384b64896688b9c16243935c80d59345a45c620c634d1e0fa2af6064" exitCode=0 Oct 03 11:13:11 crc kubenswrapper[4990]: I1003 11:13:11.010005 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6sbq" event={"ID":"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf","Type":"ContainerDied","Data":"deb3141b384b64896688b9c16243935c80d59345a45c620c634d1e0fa2af6064"} Oct 03 11:13:12 crc kubenswrapper[4990]: I1003 11:13:12.360350 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:12 crc kubenswrapper[4990]: I1003 11:13:12.523251 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78djh\" (UniqueName: \"kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh\") pod \"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf\" (UID: \"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf\") " Oct 03 11:13:12 crc kubenswrapper[4990]: I1003 11:13:12.530197 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh" (OuterVolumeSpecName: "kube-api-access-78djh") pod "b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" (UID: "b17e9d0c-0fd4-4836-bf80-8c5b24d601bf"). InnerVolumeSpecName "kube-api-access-78djh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:13:12 crc kubenswrapper[4990]: I1003 11:13:12.625667 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78djh\" (UniqueName: \"kubernetes.io/projected/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf-kube-api-access-78djh\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:13 crc kubenswrapper[4990]: I1003 11:13:13.033049 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s6sbq" Oct 03 11:13:13 crc kubenswrapper[4990]: I1003 11:13:13.032997 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6sbq" event={"ID":"b17e9d0c-0fd4-4836-bf80-8c5b24d601bf","Type":"ContainerDied","Data":"75c4f30bae62d5fa6ef734ac3f913525918ca5b79ad023e07d08b1281d9b0f16"} Oct 03 11:13:13 crc kubenswrapper[4990]: I1003 11:13:13.033769 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75c4f30bae62d5fa6ef734ac3f913525918ca5b79ad023e07d08b1281d9b0f16" Oct 03 11:13:14 crc kubenswrapper[4990]: I1003 11:13:14.199722 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.155518 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d7bb-account-create-8slwx"] Oct 03 11:13:19 crc kubenswrapper[4990]: E1003 11:13:19.156404 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" containerName="mariadb-database-create" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.156422 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" containerName="mariadb-database-create" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.156669 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" containerName="mariadb-database-create" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.157323 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.160951 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.175786 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7bb-account-create-8slwx"] Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.355666 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkrx\" (UniqueName: \"kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx\") pod \"keystone-d7bb-account-create-8slwx\" (UID: \"71c41a7f-04c6-4f86-9502-b1e881ea0fb2\") " pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.458213 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkrx\" (UniqueName: \"kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx\") pod \"keystone-d7bb-account-create-8slwx\" (UID: \"71c41a7f-04c6-4f86-9502-b1e881ea0fb2\") " pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.478563 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkrx\" (UniqueName: \"kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx\") pod \"keystone-d7bb-account-create-8slwx\" (UID: \"71c41a7f-04c6-4f86-9502-b1e881ea0fb2\") " pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.478950 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:19 crc kubenswrapper[4990]: I1003 11:13:19.906215 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d7bb-account-create-8slwx"] Oct 03 11:13:20 crc kubenswrapper[4990]: I1003 11:13:20.109062 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7bb-account-create-8slwx" event={"ID":"71c41a7f-04c6-4f86-9502-b1e881ea0fb2","Type":"ContainerStarted","Data":"259cc44370d05973d8a599db38a9e1d374badf2fdce3be1dc295cc605628efab"} Oct 03 11:13:20 crc kubenswrapper[4990]: I1003 11:13:20.109108 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7bb-account-create-8slwx" event={"ID":"71c41a7f-04c6-4f86-9502-b1e881ea0fb2","Type":"ContainerStarted","Data":"fe9b2191b4d3173c8701489432ce656889a09b463ce6099932176c12900c5dd9"} Oct 03 11:13:20 crc kubenswrapper[4990]: I1003 11:13:20.125452 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d7bb-account-create-8slwx" podStartSLOduration=1.125430837 podStartE2EDuration="1.125430837s" podCreationTimestamp="2025-10-03 11:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:20.121968068 +0000 UTC m=+5381.918599935" watchObservedRunningTime="2025-10-03 11:13:20.125430837 +0000 UTC m=+5381.922062684" Oct 03 11:13:21 crc kubenswrapper[4990]: I1003 11:13:21.117951 4990 generic.go:334] "Generic (PLEG): container finished" podID="71c41a7f-04c6-4f86-9502-b1e881ea0fb2" containerID="259cc44370d05973d8a599db38a9e1d374badf2fdce3be1dc295cc605628efab" exitCode=0 Oct 03 11:13:21 crc kubenswrapper[4990]: I1003 11:13:21.118356 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7bb-account-create-8slwx" 
event={"ID":"71c41a7f-04c6-4f86-9502-b1e881ea0fb2","Type":"ContainerDied","Data":"259cc44370d05973d8a599db38a9e1d374badf2fdce3be1dc295cc605628efab"} Oct 03 11:13:22 crc kubenswrapper[4990]: I1003 11:13:22.497155 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:22 crc kubenswrapper[4990]: I1003 11:13:22.512941 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wkrx\" (UniqueName: \"kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx\") pod \"71c41a7f-04c6-4f86-9502-b1e881ea0fb2\" (UID: \"71c41a7f-04c6-4f86-9502-b1e881ea0fb2\") " Oct 03 11:13:22 crc kubenswrapper[4990]: I1003 11:13:22.519671 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx" (OuterVolumeSpecName: "kube-api-access-9wkrx") pod "71c41a7f-04c6-4f86-9502-b1e881ea0fb2" (UID: "71c41a7f-04c6-4f86-9502-b1e881ea0fb2"). InnerVolumeSpecName "kube-api-access-9wkrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:13:22 crc kubenswrapper[4990]: I1003 11:13:22.614500 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wkrx\" (UniqueName: \"kubernetes.io/projected/71c41a7f-04c6-4f86-9502-b1e881ea0fb2-kube-api-access-9wkrx\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:23 crc kubenswrapper[4990]: I1003 11:13:23.140462 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d7bb-account-create-8slwx" event={"ID":"71c41a7f-04c6-4f86-9502-b1e881ea0fb2","Type":"ContainerDied","Data":"fe9b2191b4d3173c8701489432ce656889a09b463ce6099932176c12900c5dd9"} Oct 03 11:13:23 crc kubenswrapper[4990]: I1003 11:13:23.140501 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe9b2191b4d3173c8701489432ce656889a09b463ce6099932176c12900c5dd9" Oct 03 11:13:23 crc kubenswrapper[4990]: I1003 11:13:23.140530 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d7bb-account-create-8slwx" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.543502 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xrc49"] Oct 03 11:13:24 crc kubenswrapper[4990]: E1003 11:13:24.544289 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c41a7f-04c6-4f86-9502-b1e881ea0fb2" containerName="mariadb-account-create" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.544311 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c41a7f-04c6-4f86-9502-b1e881ea0fb2" containerName="mariadb-account-create" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.544693 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c41a7f-04c6-4f86-9502-b1e881ea0fb2" containerName="mariadb-account-create" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.545390 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.550057 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.550256 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.550821 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.550967 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9jqpb" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.555464 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xrc49"] Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.645570 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.645624 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.645688 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9qr\" (UniqueName: \"kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr\") pod \"keystone-db-sync-xrc49\" (UID: 
\"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.746938 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.747006 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.747057 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9qr\" (UniqueName: \"kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.757476 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.757630 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: 
I1003 11:13:24.771753 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9qr\" (UniqueName: \"kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr\") pod \"keystone-db-sync-xrc49\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:24 crc kubenswrapper[4990]: I1003 11:13:24.867173 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:25 crc kubenswrapper[4990]: I1003 11:13:25.318179 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xrc49"] Oct 03 11:13:25 crc kubenswrapper[4990]: W1003 11:13:25.323674 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7407cb10_f324_46b2_b231_966df1553a10.slice/crio-edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2 WatchSource:0}: Error finding container edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2: Status 404 returned error can't find the container with id edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2 Oct 03 11:13:26 crc kubenswrapper[4990]: I1003 11:13:26.168283 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xrc49" event={"ID":"7407cb10-f324-46b2-b231-966df1553a10","Type":"ContainerStarted","Data":"471e0fd6b86a5fcf70dce1249d3fd6e7fb9b0ce35f8fed1b07bc14cd18ddb277"} Oct 03 11:13:26 crc kubenswrapper[4990]: I1003 11:13:26.169219 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xrc49" event={"ID":"7407cb10-f324-46b2-b231-966df1553a10","Type":"ContainerStarted","Data":"edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2"} Oct 03 11:13:26 crc kubenswrapper[4990]: I1003 11:13:26.205268 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-xrc49" podStartSLOduration=2.205239263 podStartE2EDuration="2.205239263s" podCreationTimestamp="2025-10-03 11:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:26.194831566 +0000 UTC m=+5387.991463423" watchObservedRunningTime="2025-10-03 11:13:26.205239263 +0000 UTC m=+5388.001871160" Oct 03 11:13:27 crc kubenswrapper[4990]: I1003 11:13:27.180844 4990 generic.go:334] "Generic (PLEG): container finished" podID="7407cb10-f324-46b2-b231-966df1553a10" containerID="471e0fd6b86a5fcf70dce1249d3fd6e7fb9b0ce35f8fed1b07bc14cd18ddb277" exitCode=0 Oct 03 11:13:27 crc kubenswrapper[4990]: I1003 11:13:27.180910 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xrc49" event={"ID":"7407cb10-f324-46b2-b231-966df1553a10","Type":"ContainerDied","Data":"471e0fd6b86a5fcf70dce1249d3fd6e7fb9b0ce35f8fed1b07bc14cd18ddb277"} Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.596374 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.717670 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle\") pod \"7407cb10-f324-46b2-b231-966df1553a10\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.717733 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m9qr\" (UniqueName: \"kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr\") pod \"7407cb10-f324-46b2-b231-966df1553a10\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.717815 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data\") pod \"7407cb10-f324-46b2-b231-966df1553a10\" (UID: \"7407cb10-f324-46b2-b231-966df1553a10\") " Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.724346 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr" (OuterVolumeSpecName: "kube-api-access-4m9qr") pod "7407cb10-f324-46b2-b231-966df1553a10" (UID: "7407cb10-f324-46b2-b231-966df1553a10"). InnerVolumeSpecName "kube-api-access-4m9qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.743077 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7407cb10-f324-46b2-b231-966df1553a10" (UID: "7407cb10-f324-46b2-b231-966df1553a10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.762225 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data" (OuterVolumeSpecName: "config-data") pod "7407cb10-f324-46b2-b231-966df1553a10" (UID: "7407cb10-f324-46b2-b231-966df1553a10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.821067 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.821137 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m9qr\" (UniqueName: \"kubernetes.io/projected/7407cb10-f324-46b2-b231-966df1553a10-kube-api-access-4m9qr\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:28 crc kubenswrapper[4990]: I1003 11:13:28.821156 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7407cb10-f324-46b2-b231-966df1553a10-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.201417 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xrc49" event={"ID":"7407cb10-f324-46b2-b231-966df1553a10","Type":"ContainerDied","Data":"edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2"} Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.201761 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edf5980e88f6035ac129487bf165bd291d8670578a2afd59339715b7c6f92de2" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.201493 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xrc49" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.441014 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:13:29 crc kubenswrapper[4990]: E1003 11:13:29.441471 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7407cb10-f324-46b2-b231-966df1553a10" containerName="keystone-db-sync" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.441488 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7407cb10-f324-46b2-b231-966df1553a10" containerName="keystone-db-sync" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.441735 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7407cb10-f324-46b2-b231-966df1553a10" containerName="keystone-db-sync" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.445417 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.472172 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.513660 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s2bkg"] Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.515108 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.520081 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.520106 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.520183 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9jqpb" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.520115 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.522272 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s2bkg"] Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.549492 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.549637 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxds\" (UniqueName: \"kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.549692 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42mb\" (UniqueName: \"kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb\") pod 
\"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.549871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550043 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550087 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550160 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550191 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550237 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550267 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.550296 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.652828 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.653015 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: 
\"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.654422 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.654756 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.656329 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.656425 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657018 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 
03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657097 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657129 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657604 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657748 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxds\" (UniqueName: \"kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657799 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.657819 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42mb\" (UniqueName: \"kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.660622 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.660841 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.661370 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.661873 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.662095 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.662231 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.681237 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42mb\" (UniqueName: \"kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb\") pod \"dnsmasq-dns-7f54884d97-gblkk\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.681388 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxds\" (UniqueName: \"kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds\") pod \"keystone-bootstrap-s2bkg\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.770026 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:29 crc kubenswrapper[4990]: I1003 11:13:29.834746 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:30 crc kubenswrapper[4990]: I1003 11:13:30.303526 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:13:30 crc kubenswrapper[4990]: I1003 11:13:30.348344 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s2bkg"] Oct 03 11:13:30 crc kubenswrapper[4990]: W1003 11:13:30.355139 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e17142f_c80c_41e5_b8e8_e496437cd396.slice/crio-800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555 WatchSource:0}: Error finding container 800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555: Status 404 returned error can't find the container with id 800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555 Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.230828 4990 generic.go:334] "Generic (PLEG): container finished" podID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerID="d4300692e3248b61c74ddd02d2a9e8c4bda6d45502ca21a5220710480195ffd8" exitCode=0 Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.230966 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" event={"ID":"2ea090cc-48fe-4692-9b04-61efe1cf1770","Type":"ContainerDied","Data":"d4300692e3248b61c74ddd02d2a9e8c4bda6d45502ca21a5220710480195ffd8"} Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.231355 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" event={"ID":"2ea090cc-48fe-4692-9b04-61efe1cf1770","Type":"ContainerStarted","Data":"827201c32bf15e0241dc27a21327750ec8a04e4eda2a0d46833a4cf836b6e4de"} 
Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.239225 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s2bkg" event={"ID":"6e17142f-c80c-41e5-b8e8-e496437cd396","Type":"ContainerStarted","Data":"7e4359c799ab7a94d3ed20038926338225291d8029ceabfe4f9bec7fbbe204cc"} Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.239584 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s2bkg" event={"ID":"6e17142f-c80c-41e5-b8e8-e496437cd396","Type":"ContainerStarted","Data":"800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555"} Oct 03 11:13:31 crc kubenswrapper[4990]: I1003 11:13:31.282867 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s2bkg" podStartSLOduration=2.2827921509999998 podStartE2EDuration="2.282792151s" podCreationTimestamp="2025-10-03 11:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:31.268040553 +0000 UTC m=+5393.064672440" watchObservedRunningTime="2025-10-03 11:13:31.282792151 +0000 UTC m=+5393.079424048" Oct 03 11:13:32 crc kubenswrapper[4990]: I1003 11:13:32.250151 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" event={"ID":"2ea090cc-48fe-4692-9b04-61efe1cf1770","Type":"ContainerStarted","Data":"d2cb415e5d02e9652aa908423c69f346346639185ea583bd0af35904f4ca4402"} Oct 03 11:13:32 crc kubenswrapper[4990]: I1003 11:13:32.250422 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:13:32 crc kubenswrapper[4990]: I1003 11:13:32.301618 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" podStartSLOduration=3.301598334 podStartE2EDuration="3.301598334s" podCreationTimestamp="2025-10-03 11:13:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:32.298219587 +0000 UTC m=+5394.094851434" watchObservedRunningTime="2025-10-03 11:13:32.301598334 +0000 UTC m=+5394.098230191" Oct 03 11:13:34 crc kubenswrapper[4990]: I1003 11:13:34.271355 4990 generic.go:334] "Generic (PLEG): container finished" podID="6e17142f-c80c-41e5-b8e8-e496437cd396" containerID="7e4359c799ab7a94d3ed20038926338225291d8029ceabfe4f9bec7fbbe204cc" exitCode=0 Oct 03 11:13:34 crc kubenswrapper[4990]: I1003 11:13:34.271463 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s2bkg" event={"ID":"6e17142f-c80c-41e5-b8e8-e496437cd396","Type":"ContainerDied","Data":"7e4359c799ab7a94d3ed20038926338225291d8029ceabfe4f9bec7fbbe204cc"} Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.684050 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876205 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts\") pod \"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data\") pod \"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876560 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys\") pod 
\"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876614 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys\") pod \"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876661 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle\") pod \"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.876739 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjxds\" (UniqueName: \"kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds\") pod \"6e17142f-c80c-41e5-b8e8-e496437cd396\" (UID: \"6e17142f-c80c-41e5-b8e8-e496437cd396\") " Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.882738 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds" (OuterVolumeSpecName: "kube-api-access-fjxds") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "kube-api-access-fjxds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.883542 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.884325 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts" (OuterVolumeSpecName: "scripts") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.884604 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.902890 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data" (OuterVolumeSpecName: "config-data") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.921298 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e17142f-c80c-41e5-b8e8-e496437cd396" (UID: "6e17142f-c80c-41e5-b8e8-e496437cd396"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978800 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjxds\" (UniqueName: \"kubernetes.io/projected/6e17142f-c80c-41e5-b8e8-e496437cd396-kube-api-access-fjxds\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978833 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978842 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978850 4990 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978859 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:35 crc kubenswrapper[4990]: I1003 11:13:35.978868 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e17142f-c80c-41e5-b8e8-e496437cd396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.292118 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s2bkg" event={"ID":"6e17142f-c80c-41e5-b8e8-e496437cd396","Type":"ContainerDied","Data":"800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555"} Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 
11:13:36.292201 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800a32355d6c7e07cca7562636bdcff6f91c102b819d5af49b94a89b9283e555" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.292216 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s2bkg" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.482186 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s2bkg"] Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.494462 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s2bkg"] Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.577190 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bbj4x"] Oct 03 11:13:36 crc kubenswrapper[4990]: E1003 11:13:36.577745 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e17142f-c80c-41e5-b8e8-e496437cd396" containerName="keystone-bootstrap" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.577770 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e17142f-c80c-41e5-b8e8-e496437cd396" containerName="keystone-bootstrap" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.578130 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e17142f-c80c-41e5-b8e8-e496437cd396" containerName="keystone-bootstrap" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.579353 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.615292 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.615388 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.615969 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.616255 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9jqpb" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.619040 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbj4x"] Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.716785 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tlp\" (UniqueName: \"kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.717494 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.717592 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys\") pod 
\"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.717671 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.717699 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.717727 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819349 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819397 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " 
pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819421 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819565 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26tlp\" (UniqueName: \"kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819604 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.819625 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.823943 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 
11:13:36.824111 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.826833 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.826862 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.828713 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.840144 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tlp\" (UniqueName: \"kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp\") pod \"keystone-bootstrap-bbj4x\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") " pod="openstack/keystone-bootstrap-bbj4x" Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.894058 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e17142f-c80c-41e5-b8e8-e496437cd396" 
path="/var/lib/kubelet/pods/6e17142f-c80c-41e5-b8e8-e496437cd396/volumes"
Oct 03 11:13:36 crc kubenswrapper[4990]: I1003 11:13:36.937495 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbj4x"
Oct 03 11:13:37 crc kubenswrapper[4990]: I1003 11:13:37.416747 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbj4x"]
Oct 03 11:13:38 crc kubenswrapper[4990]: I1003 11:13:38.324479 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbj4x" event={"ID":"338090dc-62dd-40a5-b506-df026387f291","Type":"ContainerStarted","Data":"c0676911ca071fa924ef341ee2fa511f140639eeef1c840153ab8e60d4397635"}
Oct 03 11:13:38 crc kubenswrapper[4990]: I1003 11:13:38.324545 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbj4x" event={"ID":"338090dc-62dd-40a5-b506-df026387f291","Type":"ContainerStarted","Data":"aa785f3e3f674157ab25a402cefd79d0fafff753e3f471c25a754482ed5bf2bf"}
Oct 03 11:13:38 crc kubenswrapper[4990]: I1003 11:13:38.357355 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bbj4x" podStartSLOduration=2.357336153 podStartE2EDuration="2.357336153s" podCreationTimestamp="2025-10-03 11:13:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:38.356953463 +0000 UTC m=+5400.153585320" watchObservedRunningTime="2025-10-03 11:13:38.357336153 +0000 UTC m=+5400.153968010"
Oct 03 11:13:39 crc kubenswrapper[4990]: I1003 11:13:39.771761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f54884d97-gblkk"
Oct 03 11:13:39 crc kubenswrapper[4990]: I1003 11:13:39.836069 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"]
Oct 03 11:13:39 crc kubenswrapper[4990]: I1003 11:13:39.836422 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="dnsmasq-dns" containerID="cri-o://26774592c48626448f932d8de38e128d2946d821756d409380b14325e928bb90" gracePeriod=10
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.344272 4990 generic.go:334] "Generic (PLEG): container finished" podID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerID="26774592c48626448f932d8de38e128d2946d821756d409380b14325e928bb90" exitCode=0
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.344363 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" event={"ID":"a093e3ba-92f1-4c4d-b235-6409528c32c3","Type":"ContainerDied","Data":"26774592c48626448f932d8de38e128d2946d821756d409380b14325e928bb90"}
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.344553 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" event={"ID":"a093e3ba-92f1-4c4d-b235-6409528c32c3","Type":"ContainerDied","Data":"97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a"}
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.344565 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97516b17b5a59f695a22bbf856d0febeb7a483f3c1d1dc21a61b60ba31bae80a"
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.388861 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r"
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.399150 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnvcn\" (UniqueName: \"kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn\") pod \"a093e3ba-92f1-4c4d-b235-6409528c32c3\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") "
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.399204 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config\") pod \"a093e3ba-92f1-4c4d-b235-6409528c32c3\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") "
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.399257 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc\") pod \"a093e3ba-92f1-4c4d-b235-6409528c32c3\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") "
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.399343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb\") pod \"a093e3ba-92f1-4c4d-b235-6409528c32c3\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") "
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.399416 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb\") pod \"a093e3ba-92f1-4c4d-b235-6409528c32c3\" (UID: \"a093e3ba-92f1-4c4d-b235-6409528c32c3\") "
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.405446 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn" (OuterVolumeSpecName: "kube-api-access-gnvcn") pod "a093e3ba-92f1-4c4d-b235-6409528c32c3" (UID: "a093e3ba-92f1-4c4d-b235-6409528c32c3"). InnerVolumeSpecName "kube-api-access-gnvcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.445922 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a093e3ba-92f1-4c4d-b235-6409528c32c3" (UID: "a093e3ba-92f1-4c4d-b235-6409528c32c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.450314 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config" (OuterVolumeSpecName: "config") pod "a093e3ba-92f1-4c4d-b235-6409528c32c3" (UID: "a093e3ba-92f1-4c4d-b235-6409528c32c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.462130 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a093e3ba-92f1-4c4d-b235-6409528c32c3" (UID: "a093e3ba-92f1-4c4d-b235-6409528c32c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.473559 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a093e3ba-92f1-4c4d-b235-6409528c32c3" (UID: "a093e3ba-92f1-4c4d-b235-6409528c32c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.500858 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.500884 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.500894 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnvcn\" (UniqueName: \"kubernetes.io/projected/a093e3ba-92f1-4c4d-b235-6409528c32c3-kube-api-access-gnvcn\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.500907 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-config\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:40 crc kubenswrapper[4990]: I1003 11:13:40.500916 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a093e3ba-92f1-4c4d-b235-6409528c32c3-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:41 crc kubenswrapper[4990]: I1003 11:13:41.353662 4990 generic.go:334] "Generic (PLEG): container finished" podID="338090dc-62dd-40a5-b506-df026387f291" containerID="c0676911ca071fa924ef341ee2fa511f140639eeef1c840153ab8e60d4397635" exitCode=0
Oct 03 11:13:41 crc kubenswrapper[4990]: I1003 11:13:41.353793 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbj4x" event={"ID":"338090dc-62dd-40a5-b506-df026387f291","Type":"ContainerDied","Data":"c0676911ca071fa924ef341ee2fa511f140639eeef1c840153ab8e60d4397635"}
Oct 03 11:13:41 crc kubenswrapper[4990]: I1003 11:13:41.354089 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r"
Oct 03 11:13:41 crc kubenswrapper[4990]: I1003 11:13:41.403413 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"]
Oct 03 11:13:41 crc kubenswrapper[4990]: I1003 11:13:41.410606 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fff56db7f-bqr9r"]
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.774837 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbj4x"
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.852822 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.852866 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26tlp\" (UniqueName: \"kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.852919 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.852945 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.853008 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.853057 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data\") pod \"338090dc-62dd-40a5-b506-df026387f291\" (UID: \"338090dc-62dd-40a5-b506-df026387f291\") "
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.859339 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.859376 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.860003 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp" (OuterVolumeSpecName: "kube-api-access-26tlp") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "kube-api-access-26tlp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.861141 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts" (OuterVolumeSpecName: "scripts") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.884129 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data" (OuterVolumeSpecName: "config-data") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.885439 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" path="/var/lib/kubelet/pods/a093e3ba-92f1-4c4d-b235-6409528c32c3/volumes"
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.888941 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "338090dc-62dd-40a5-b506-df026387f291" (UID: "338090dc-62dd-40a5-b506-df026387f291"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954619 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954651 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954664 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26tlp\" (UniqueName: \"kubernetes.io/projected/338090dc-62dd-40a5-b506-df026387f291-kube-api-access-26tlp\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954676 4990 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954689 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:42 crc kubenswrapper[4990]: I1003 11:13:42.954702 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/338090dc-62dd-40a5-b506-df026387f291-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.374708 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbj4x" event={"ID":"338090dc-62dd-40a5-b506-df026387f291","Type":"ContainerDied","Data":"aa785f3e3f674157ab25a402cefd79d0fafff753e3f471c25a754482ed5bf2bf"}
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.375058 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa785f3e3f674157ab25a402cefd79d0fafff753e3f471c25a754482ed5bf2bf"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.375130 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbj4x"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.465952 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8459bb745d-tcmjm"]
Oct 03 11:13:43 crc kubenswrapper[4990]: E1003 11:13:43.466300 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="dnsmasq-dns"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.466313 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="dnsmasq-dns"
Oct 03 11:13:43 crc kubenswrapper[4990]: E1003 11:13:43.466334 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338090dc-62dd-40a5-b506-df026387f291" containerName="keystone-bootstrap"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.466340 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="338090dc-62dd-40a5-b506-df026387f291" containerName="keystone-bootstrap"
Oct 03 11:13:43 crc kubenswrapper[4990]: E1003 11:13:43.466347 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="init"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.466353 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="init"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.466534 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="dnsmasq-dns"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.466547 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="338090dc-62dd-40a5-b506-df026387f291" containerName="keystone-bootstrap"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.467104 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.476355 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9jqpb"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.476805 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.477085 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.477249 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.477358 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.477489 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.484822 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8459bb745d-tcmjm"]
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.563894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-internal-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.563987 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-credential-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564018 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrh79\" (UniqueName: \"kubernetes.io/projected/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-kube-api-access-hrh79\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564044 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-fernet-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564074 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-config-data\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564095 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-combined-ca-bundle\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564130 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-scripts\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.564153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-public-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665298 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-internal-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665369 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-credential-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665391 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrh79\" (UniqueName: \"kubernetes.io/projected/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-kube-api-access-hrh79\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665410 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-fernet-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665437 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-config-data\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-combined-ca-bundle\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.665496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-scripts\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.666119 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-public-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.670447 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-public-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.670496 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-fernet-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.673814 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-scripts\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.674258 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-combined-ca-bundle\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.674278 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-internal-tls-certs\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.675837 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-config-data\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.675894 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-credential-keys\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.685430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrh79\" (UniqueName: \"kubernetes.io/projected/1b2e7a2e-0ed1-4004-a487-24b50c0c956d-kube-api-access-hrh79\") pod \"keystone-8459bb745d-tcmjm\" (UID: \"1b2e7a2e-0ed1-4004-a487-24b50c0c956d\") " pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:43 crc kubenswrapper[4990]: I1003 11:13:43.799398 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:44 crc kubenswrapper[4990]: I1003 11:13:44.257915 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8459bb745d-tcmjm"]
Oct 03 11:13:44 crc kubenswrapper[4990]: I1003 11:13:44.387298 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8459bb745d-tcmjm" event={"ID":"1b2e7a2e-0ed1-4004-a487-24b50c0c956d","Type":"ContainerStarted","Data":"d943acaea7f12fe0f5e099dcea34d9f1cfb7bc1a82f114bf98a61cb3ef190a80"}
Oct 03 11:13:45 crc kubenswrapper[4990]: I1003 11:13:45.279162 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fff56db7f-bqr9r" podUID="a093e3ba-92f1-4c4d-b235-6409528c32c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.15:5353: i/o timeout"
Oct 03 11:13:45 crc kubenswrapper[4990]: I1003 11:13:45.400644 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8459bb745d-tcmjm" event={"ID":"1b2e7a2e-0ed1-4004-a487-24b50c0c956d","Type":"ContainerStarted","Data":"fbaf9bf7be9cf5ff6354314ca888e6ecca5304ca8f82256a8ee60a276e451d23"}
Oct 03 11:13:45 crc kubenswrapper[4990]: I1003 11:13:45.400848 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:13:45 crc kubenswrapper[4990]: I1003 11:13:45.423987 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8459bb745d-tcmjm" podStartSLOduration=2.423967251 podStartE2EDuration="2.423967251s" podCreationTimestamp="2025-10-03 11:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:13:45.41611297 +0000 UTC m=+5407.212744827" watchObservedRunningTime="2025-10-03 11:13:45.423967251 +0000 UTC m=+5407.220599108"
Oct 03 11:14:15 crc kubenswrapper[4990]: I1003 11:14:15.311947 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8459bb745d-tcmjm"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.211731 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.213557 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.215770 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.215771 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vhmgm"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.219054 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.255239 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.283186 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: E1003 11:14:19.283780 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-7n2f8 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-7n2f8 openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="27b82f19-a743-46ed-8514-5e8ae40aa938"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.297778 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.314891 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.316035 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.323963 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.443404 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.443474 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zcm\" (UniqueName: \"kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.443546 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.443650 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.545257 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.545575 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76zcm\" (UniqueName: \"kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.545760 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.545903 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.547823 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.555585 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.571666 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.575258 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zcm\" (UniqueName: \"kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm\") pod \"openstackclient\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.636305 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.732244 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.782701 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="27b82f19-a743-46ed-8514-5e8ae40aa938" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe"
Oct 03 11:14:19 crc kubenswrapper[4990]: I1003 11:14:19.788223 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.140572 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 11:14:20 crc kubenswrapper[4990]: W1003 11:14:20.142678 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e9d7f7_f069_4491_bd0e_f40b43ed51fe.slice/crio-88cb5069906c47eb52f297ce637b21f3b34a6cc9371e870091cb02717d204a37 WatchSource:0}: Error finding container 88cb5069906c47eb52f297ce637b21f3b34a6cc9371e870091cb02717d204a37: Status 404 returned error can't find the container with id 88cb5069906c47eb52f297ce637b21f3b34a6cc9371e870091cb02717d204a37 Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.743910 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.744181 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"17e9d7f7-f069-4491-bd0e-f40b43ed51fe","Type":"ContainerStarted","Data":"eee9ddb4b22ed62d57a21ad6d7caf3839300771b294daf1bc986afbb0bee418a"} Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.744228 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"17e9d7f7-f069-4491-bd0e-f40b43ed51fe","Type":"ContainerStarted","Data":"88cb5069906c47eb52f297ce637b21f3b34a6cc9371e870091cb02717d204a37"} Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.748544 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="27b82f19-a743-46ed-8514-5e8ae40aa938" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.767637 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=1.767619411 podStartE2EDuration="1.767619411s" podCreationTimestamp="2025-10-03 11:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:14:20.766467152 +0000 UTC m=+5442.563099019" watchObservedRunningTime="2025-10-03 11:14:20.767619411 +0000 UTC m=+5442.564251268" Oct 03 11:14:20 crc kubenswrapper[4990]: I1003 11:14:20.882852 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b82f19-a743-46ed-8514-5e8ae40aa938" path="/var/lib/kubelet/pods/27b82f19-a743-46ed-8514-5e8ae40aa938/volumes" Oct 03 11:14:25 crc kubenswrapper[4990]: I1003 11:14:25.303937 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:14:25 crc kubenswrapper[4990]: I1003 11:14:25.304571 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:14:48 crc kubenswrapper[4990]: I1003 11:14:48.060196 4990 scope.go:117] "RemoveContainer" containerID="8ec4a268ed3978f68bf390d5b10546e128d801c4fe6383a17f5e33a8f41d7640" Oct 03 11:14:48 crc kubenswrapper[4990]: I1003 11:14:48.100883 4990 scope.go:117] "RemoveContainer" containerID="d362c750436ca42753dff65b5c57c1595909ad8e2a00546631fd1dc7abf93161" Oct 03 11:14:48 crc kubenswrapper[4990]: I1003 11:14:48.151219 4990 scope.go:117] "RemoveContainer" containerID="0588746fd1b3d51705868bf75254477c3bb78e258f976d1a27de508c9663f03b" Oct 03 11:14:48 crc kubenswrapper[4990]: I1003 11:14:48.196063 4990 
scope.go:117] "RemoveContainer" containerID="f78d79c994dda2476a8980fa680ff8b50a8dc2c46cb238f932c7a586c373d319" Oct 03 11:14:55 crc kubenswrapper[4990]: I1003 11:14:55.303999 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:14:55 crc kubenswrapper[4990]: I1003 11:14:55.304730 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.160430 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s"] Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.162415 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.170589 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s"] Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.171704 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.171896 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.267174 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.267221 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62b2r\" (UniqueName: \"kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.267617 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.369368 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.369457 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.369489 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62b2r\" (UniqueName: \"kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.370745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.375166 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.393923 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62b2r\" (UniqueName: \"kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r\") pod \"collect-profiles-29324835-76f4s\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.482867 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:00 crc kubenswrapper[4990]: I1003 11:15:00.908933 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s"] Oct 03 11:15:01 crc kubenswrapper[4990]: I1003 11:15:01.174499 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" event={"ID":"ab766225-b91b-4563-94fd-6b5a679ec419","Type":"ContainerStarted","Data":"485690513bc470ca3e1e6af02639d1407d43dd91380520a3b247a7ce9e4ab225"} Oct 03 11:15:01 crc kubenswrapper[4990]: I1003 11:15:01.174567 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" event={"ID":"ab766225-b91b-4563-94fd-6b5a679ec419","Type":"ContainerStarted","Data":"c6c5dcef0d79f000c6e338410d2d22e141d81c7d578c8cbf4e49e6a40d440be1"} Oct 03 11:15:01 crc kubenswrapper[4990]: I1003 11:15:01.194623 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" 
podStartSLOduration=1.194608438 podStartE2EDuration="1.194608438s" podCreationTimestamp="2025-10-03 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:15:01.19042458 +0000 UTC m=+5482.987056447" watchObservedRunningTime="2025-10-03 11:15:01.194608438 +0000 UTC m=+5482.991240295" Oct 03 11:15:02 crc kubenswrapper[4990]: I1003 11:15:02.184429 4990 generic.go:334] "Generic (PLEG): container finished" podID="ab766225-b91b-4563-94fd-6b5a679ec419" containerID="485690513bc470ca3e1e6af02639d1407d43dd91380520a3b247a7ce9e4ab225" exitCode=0 Oct 03 11:15:02 crc kubenswrapper[4990]: I1003 11:15:02.184479 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" event={"ID":"ab766225-b91b-4563-94fd-6b5a679ec419","Type":"ContainerDied","Data":"485690513bc470ca3e1e6af02639d1407d43dd91380520a3b247a7ce9e4ab225"} Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.527681 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.623047 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume\") pod \"ab766225-b91b-4563-94fd-6b5a679ec419\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.623125 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62b2r\" (UniqueName: \"kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r\") pod \"ab766225-b91b-4563-94fd-6b5a679ec419\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.623269 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume\") pod \"ab766225-b91b-4563-94fd-6b5a679ec419\" (UID: \"ab766225-b91b-4563-94fd-6b5a679ec419\") " Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.623749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab766225-b91b-4563-94fd-6b5a679ec419" (UID: "ab766225-b91b-4563-94fd-6b5a679ec419"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.623987 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab766225-b91b-4563-94fd-6b5a679ec419-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.628708 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r" (OuterVolumeSpecName: "kube-api-access-62b2r") pod "ab766225-b91b-4563-94fd-6b5a679ec419" (UID: "ab766225-b91b-4563-94fd-6b5a679ec419"). InnerVolumeSpecName "kube-api-access-62b2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.630295 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab766225-b91b-4563-94fd-6b5a679ec419" (UID: "ab766225-b91b-4563-94fd-6b5a679ec419"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.725693 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62b2r\" (UniqueName: \"kubernetes.io/projected/ab766225-b91b-4563-94fd-6b5a679ec419-kube-api-access-62b2r\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:03 crc kubenswrapper[4990]: I1003 11:15:03.726066 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab766225-b91b-4563-94fd-6b5a679ec419-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.214314 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" event={"ID":"ab766225-b91b-4563-94fd-6b5a679ec419","Type":"ContainerDied","Data":"c6c5dcef0d79f000c6e338410d2d22e141d81c7d578c8cbf4e49e6a40d440be1"} Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.214399 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c5dcef0d79f000c6e338410d2d22e141d81c7d578c8cbf4e49e6a40d440be1" Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.214597 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s" Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.283014 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6"] Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.290858 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-hzhh6"] Oct 03 11:15:04 crc kubenswrapper[4990]: I1003 11:15:04.884158 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d9d2a9-a20c-4589-9f8d-016e0c66141f" path="/var/lib/kubelet/pods/a1d9d2a9-a20c-4589-9f8d-016e0c66141f/volumes" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.190763 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:16 crc kubenswrapper[4990]: E1003 11:15:16.191688 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab766225-b91b-4563-94fd-6b5a679ec419" containerName="collect-profiles" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.191701 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab766225-b91b-4563-94fd-6b5a679ec419" containerName="collect-profiles" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.191862 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab766225-b91b-4563-94fd-6b5a679ec419" containerName="collect-profiles" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.193094 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.198288 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.250426 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgm7\" (UniqueName: \"kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.250885 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.251163 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.353272 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.353788 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xcgm7\" (UniqueName: \"kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.353868 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.355562 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.356009 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.376397 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgm7\" (UniqueName: \"kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7\") pod \"certified-operators-9flf2\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:16 crc kubenswrapper[4990]: I1003 11:15:16.518018 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.071935 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.322490 4990 generic.go:334] "Generic (PLEG): container finished" podID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerID="cc9937becac5e30f9543a676fcd6b82dbc592adf3208015cebb49dbb3e51c1f2" exitCode=0 Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.322586 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerDied","Data":"cc9937becac5e30f9543a676fcd6b82dbc592adf3208015cebb49dbb3e51c1f2"} Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.322635 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerStarted","Data":"c691d3bb2dbf4361b7dfae7eba1b6450a763f12e338d63cc9a2c9f86aad64627"} Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.389235 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.392560 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.402984 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.475554 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.475639 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmr7h\" (UniqueName: \"kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.475712 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.576885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmr7h\" (UniqueName: \"kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.576995 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.577116 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.577699 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.577721 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.607245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmr7h\" (UniqueName: \"kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h\") pod \"community-operators-rwsk7\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:17 crc kubenswrapper[4990]: I1003 11:15:17.714792 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:18 crc kubenswrapper[4990]: I1003 11:15:18.184312 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:18 crc kubenswrapper[4990]: W1003 11:15:18.187246 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4221292_a56c_4751_b423_8a5111b27b91.slice/crio-7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8 WatchSource:0}: Error finding container 7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8: Status 404 returned error can't find the container with id 7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8 Oct 03 11:15:18 crc kubenswrapper[4990]: I1003 11:15:18.331087 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerStarted","Data":"8a36404289c0db5d4dd3b4b3e7b7d8c20d72b11ff53ce54f0e62edb865c121f8"} Oct 03 11:15:18 crc kubenswrapper[4990]: I1003 11:15:18.332858 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerStarted","Data":"7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8"} Oct 03 11:15:19 crc kubenswrapper[4990]: I1003 11:15:19.342488 4990 generic.go:334] "Generic (PLEG): container finished" podID="c4221292-a56c-4751-b423-8a5111b27b91" containerID="ce9973279e392bf1d3264abf9bd2d1823d104168f3143812f537fd24b70ee2e0" exitCode=0 Oct 03 11:15:19 crc kubenswrapper[4990]: I1003 11:15:19.342576 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" 
event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerDied","Data":"ce9973279e392bf1d3264abf9bd2d1823d104168f3143812f537fd24b70ee2e0"} Oct 03 11:15:19 crc kubenswrapper[4990]: I1003 11:15:19.345445 4990 generic.go:334] "Generic (PLEG): container finished" podID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerID="8a36404289c0db5d4dd3b4b3e7b7d8c20d72b11ff53ce54f0e62edb865c121f8" exitCode=0 Oct 03 11:15:19 crc kubenswrapper[4990]: I1003 11:15:19.345481 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerDied","Data":"8a36404289c0db5d4dd3b4b3e7b7d8c20d72b11ff53ce54f0e62edb865c121f8"} Oct 03 11:15:20 crc kubenswrapper[4990]: I1003 11:15:20.355476 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerStarted","Data":"75dfe36e67f2a06f48f8f8e7b0a7f36ed317a7338f386823df48b6f5ea0d297b"} Oct 03 11:15:20 crc kubenswrapper[4990]: I1003 11:15:20.385048 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9flf2" podStartSLOduration=1.853358947 podStartE2EDuration="4.384985218s" podCreationTimestamp="2025-10-03 11:15:16 +0000 UTC" firstStartedPulling="2025-10-03 11:15:17.324713052 +0000 UTC m=+5499.121344909" lastFinishedPulling="2025-10-03 11:15:19.856339283 +0000 UTC m=+5501.652971180" observedRunningTime="2025-10-03 11:15:20.381346255 +0000 UTC m=+5502.177978122" watchObservedRunningTime="2025-10-03 11:15:20.384985218 +0000 UTC m=+5502.181617075" Oct 03 11:15:21 crc kubenswrapper[4990]: I1003 11:15:21.370118 4990 generic.go:334] "Generic (PLEG): container finished" podID="c4221292-a56c-4751-b423-8a5111b27b91" containerID="546d73db120b08f4ba62e9d0ceb77f013d502b1e81001f8701db0d7f74aaa7dd" exitCode=0 Oct 03 11:15:21 crc kubenswrapper[4990]: I1003 
11:15:21.370270 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerDied","Data":"546d73db120b08f4ba62e9d0ceb77f013d502b1e81001f8701db0d7f74aaa7dd"} Oct 03 11:15:22 crc kubenswrapper[4990]: I1003 11:15:22.380576 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerStarted","Data":"62be4a19c4dc7cb13d064b89cb5a7b0d7aca046c5c72b58d721cf64d9f3a4473"} Oct 03 11:15:23 crc kubenswrapper[4990]: I1003 11:15:23.411874 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwsk7" podStartSLOduration=3.854373874 podStartE2EDuration="6.411858707s" podCreationTimestamp="2025-10-03 11:15:17 +0000 UTC" firstStartedPulling="2025-10-03 11:15:19.345081745 +0000 UTC m=+5501.141713602" lastFinishedPulling="2025-10-03 11:15:21.902566578 +0000 UTC m=+5503.699198435" observedRunningTime="2025-10-03 11:15:23.411109738 +0000 UTC m=+5505.207741605" watchObservedRunningTime="2025-10-03 11:15:23.411858707 +0000 UTC m=+5505.208490564" Oct 03 11:15:25 crc kubenswrapper[4990]: I1003 11:15:25.304328 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:15:25 crc kubenswrapper[4990]: I1003 11:15:25.304802 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:15:25 crc 
kubenswrapper[4990]: I1003 11:15:25.304896 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:15:25 crc kubenswrapper[4990]: I1003 11:15:25.305993 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:15:25 crc kubenswrapper[4990]: I1003 11:15:25.306141 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb" gracePeriod=600 Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.414468 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb" exitCode=0 Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.414705 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb"} Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.414831 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16"} Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 
11:15:26.414866 4990 scope.go:117] "RemoveContainer" containerID="d5d9c383dc10942c1b1e2000a52e35c434b4372e885918b265fea6c1e52691b7" Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.518845 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.518978 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:26 crc kubenswrapper[4990]: I1003 11:15:26.606809 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:27 crc kubenswrapper[4990]: I1003 11:15:27.482386 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:27 crc kubenswrapper[4990]: I1003 11:15:27.716324 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:27 crc kubenswrapper[4990]: I1003 11:15:27.716741 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:27 crc kubenswrapper[4990]: I1003 11:15:27.779465 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:28 crc kubenswrapper[4990]: I1003 11:15:28.173107 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:28 crc kubenswrapper[4990]: I1003 11:15:28.511216 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:29 crc kubenswrapper[4990]: I1003 11:15:29.447910 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-9flf2" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="registry-server" containerID="cri-o://75dfe36e67f2a06f48f8f8e7b0a7f36ed317a7338f386823df48b6f5ea0d297b" gracePeriod=2 Oct 03 11:15:30 crc kubenswrapper[4990]: I1003 11:15:30.773716 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:30 crc kubenswrapper[4990]: I1003 11:15:30.774550 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwsk7" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="registry-server" containerID="cri-o://62be4a19c4dc7cb13d064b89cb5a7b0d7aca046c5c72b58d721cf64d9f3a4473" gracePeriod=2 Oct 03 11:15:31 crc kubenswrapper[4990]: I1003 11:15:31.469626 4990 generic.go:334] "Generic (PLEG): container finished" podID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerID="75dfe36e67f2a06f48f8f8e7b0a7f36ed317a7338f386823df48b6f5ea0d297b" exitCode=0 Oct 03 11:15:31 crc kubenswrapper[4990]: I1003 11:15:31.469681 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerDied","Data":"75dfe36e67f2a06f48f8f8e7b0a7f36ed317a7338f386823df48b6f5ea0d297b"} Oct 03 11:15:31 crc kubenswrapper[4990]: I1003 11:15:31.826637 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:31 crc kubenswrapper[4990]: I1003 11:15:31.895919 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgm7\" (UniqueName: \"kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7\") pod \"b3401d65-8d6a-4941-bde4-cb2840148f29\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " Oct 03 11:15:31 crc kubenswrapper[4990]: I1003 11:15:31.901495 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7" (OuterVolumeSpecName: "kube-api-access-xcgm7") pod "b3401d65-8d6a-4941-bde4-cb2840148f29" (UID: "b3401d65-8d6a-4941-bde4-cb2840148f29"). InnerVolumeSpecName "kube-api-access-xcgm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.000013 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content\") pod \"b3401d65-8d6a-4941-bde4-cb2840148f29\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.000085 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities\") pod \"b3401d65-8d6a-4941-bde4-cb2840148f29\" (UID: \"b3401d65-8d6a-4941-bde4-cb2840148f29\") " Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.000616 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgm7\" (UniqueName: \"kubernetes.io/projected/b3401d65-8d6a-4941-bde4-cb2840148f29-kube-api-access-xcgm7\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.001565 4990 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities" (OuterVolumeSpecName: "utilities") pod "b3401d65-8d6a-4941-bde4-cb2840148f29" (UID: "b3401d65-8d6a-4941-bde4-cb2840148f29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.102653 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.479203 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flf2" event={"ID":"b3401d65-8d6a-4941-bde4-cb2840148f29","Type":"ContainerDied","Data":"c691d3bb2dbf4361b7dfae7eba1b6450a763f12e338d63cc9a2c9f86aad64627"} Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.479568 4990 scope.go:117] "RemoveContainer" containerID="75dfe36e67f2a06f48f8f8e7b0a7f36ed317a7338f386823df48b6f5ea0d297b" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.479274 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9flf2" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.513406 4990 scope.go:117] "RemoveContainer" containerID="8a36404289c0db5d4dd3b4b3e7b7d8c20d72b11ff53ce54f0e62edb865c121f8" Oct 03 11:15:32 crc kubenswrapper[4990]: I1003 11:15:32.539915 4990 scope.go:117] "RemoveContainer" containerID="cc9937becac5e30f9543a676fcd6b82dbc592adf3208015cebb49dbb3e51c1f2" Oct 03 11:15:34 crc kubenswrapper[4990]: I1003 11:15:34.516849 4990 generic.go:334] "Generic (PLEG): container finished" podID="c4221292-a56c-4751-b423-8a5111b27b91" containerID="62be4a19c4dc7cb13d064b89cb5a7b0d7aca046c5c72b58d721cf64d9f3a4473" exitCode=0 Oct 03 11:15:34 crc kubenswrapper[4990]: I1003 11:15:34.516903 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerDied","Data":"62be4a19c4dc7cb13d064b89cb5a7b0d7aca046c5c72b58d721cf64d9f3a4473"} Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.673131 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-79nsj"] Oct 03 11:15:35 crc kubenswrapper[4990]: E1003 11:15:35.673856 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="extract-content" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.673872 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="extract-content" Oct 03 11:15:35 crc kubenswrapper[4990]: E1003 11:15:35.673885 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="extract-utilities" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.673892 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="extract-utilities" Oct 03 11:15:35 crc 
kubenswrapper[4990]: E1003 11:15:35.673912 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="registry-server" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.673918 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="registry-server" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.674076 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" containerName="registry-server" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.674618 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.689050 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-79nsj"] Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.696007 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47trf\" (UniqueName: \"kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf\") pod \"barbican-db-create-79nsj\" (UID: \"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3\") " pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.798022 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47trf\" (UniqueName: \"kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf\") pod \"barbican-db-create-79nsj\" (UID: \"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3\") " pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.819194 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47trf\" (UniqueName: \"kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf\") pod 
\"barbican-db-create-79nsj\" (UID: \"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3\") " pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:35 crc kubenswrapper[4990]: I1003 11:15:35.989822 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:36 crc kubenswrapper[4990]: I1003 11:15:36.449157 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-79nsj"] Oct 03 11:15:36 crc kubenswrapper[4990]: I1003 11:15:36.532196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-79nsj" event={"ID":"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3","Type":"ContainerStarted","Data":"67232ebf48ba70632ede552888836949ba5177533afe76d80a28c403a6c25a7e"} Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.540422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwsk7" event={"ID":"c4221292-a56c-4751-b423-8a5111b27b91","Type":"ContainerDied","Data":"7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8"} Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.541845 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6a7efa727aed74f4465a264b7ebb6a040f86b14c98ee5680a260d87409abc8" Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.574027 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.642127 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmr7h\" (UniqueName: \"kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h\") pod \"c4221292-a56c-4751-b423-8a5111b27b91\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.642680 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities\") pod \"c4221292-a56c-4751-b423-8a5111b27b91\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.642885 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content\") pod \"c4221292-a56c-4751-b423-8a5111b27b91\" (UID: \"c4221292-a56c-4751-b423-8a5111b27b91\") " Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.643540 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities" (OuterVolumeSpecName: "utilities") pod "c4221292-a56c-4751-b423-8a5111b27b91" (UID: "c4221292-a56c-4751-b423-8a5111b27b91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.649485 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h" (OuterVolumeSpecName: "kube-api-access-lmr7h") pod "c4221292-a56c-4751-b423-8a5111b27b91" (UID: "c4221292-a56c-4751-b423-8a5111b27b91"). InnerVolumeSpecName "kube-api-access-lmr7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.744312 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:37 crc kubenswrapper[4990]: I1003 11:15:37.744348 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmr7h\" (UniqueName: \"kubernetes.io/projected/c4221292-a56c-4751-b423-8a5111b27b91-kube-api-access-lmr7h\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.547339 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwsk7" Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.754656 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3401d65-8d6a-4941-bde4-cb2840148f29" (UID: "b3401d65-8d6a-4941-bde4-cb2840148f29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.763049 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3401d65-8d6a-4941-bde4-cb2840148f29-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.840627 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.850137 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9flf2"] Oct 03 11:15:38 crc kubenswrapper[4990]: I1003 11:15:38.883587 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3401d65-8d6a-4941-bde4-cb2840148f29" path="/var/lib/kubelet/pods/b3401d65-8d6a-4941-bde4-cb2840148f29/volumes" Oct 03 11:15:39 crc kubenswrapper[4990]: I1003 11:15:39.554964 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-79nsj" event={"ID":"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3","Type":"ContainerStarted","Data":"5508dd7652b00f7f994143650fab7a1b975be2dd4c9cfcd4737414eafbe938cb"} Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.418092 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4221292-a56c-4751-b423-8a5111b27b91" (UID: "c4221292-a56c-4751-b423-8a5111b27b91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.493584 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4221292-a56c-4751-b423-8a5111b27b91-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.566005 4990 generic.go:334] "Generic (PLEG): container finished" podID="9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" containerID="5508dd7652b00f7f994143650fab7a1b975be2dd4c9cfcd4737414eafbe938cb" exitCode=0 Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.566050 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-79nsj" event={"ID":"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3","Type":"ContainerDied","Data":"5508dd7652b00f7f994143650fab7a1b975be2dd4c9cfcd4737414eafbe938cb"} Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.707058 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.713479 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwsk7"] Oct 03 11:15:40 crc kubenswrapper[4990]: I1003 11:15:40.881136 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4221292-a56c-4751-b423-8a5111b27b91" path="/var/lib/kubelet/pods/c4221292-a56c-4751-b423-8a5111b27b91/volumes" Oct 03 11:15:41 crc kubenswrapper[4990]: I1003 11:15:41.930915 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.023531 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47trf\" (UniqueName: \"kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf\") pod \"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3\" (UID: \"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3\") " Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.029684 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf" (OuterVolumeSpecName: "kube-api-access-47trf") pod "9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" (UID: "9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3"). InnerVolumeSpecName "kube-api-access-47trf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.126232 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47trf\" (UniqueName: \"kubernetes.io/projected/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3-kube-api-access-47trf\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.585969 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-79nsj" event={"ID":"9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3","Type":"ContainerDied","Data":"67232ebf48ba70632ede552888836949ba5177533afe76d80a28c403a6c25a7e"} Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.586009 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-79nsj" Oct 03 11:15:42 crc kubenswrapper[4990]: I1003 11:15:42.586023 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67232ebf48ba70632ede552888836949ba5177533afe76d80a28c403a6c25a7e" Oct 03 11:15:48 crc kubenswrapper[4990]: I1003 11:15:48.280779 4990 scope.go:117] "RemoveContainer" containerID="f3b15d0020cea5147559c9a3f562c1bf7cd08efba88ec32c78a3e331800e0efe" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.822741 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bc6c-account-create-zcz47"] Oct 03 11:15:55 crc kubenswrapper[4990]: E1003 11:15:55.823764 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="extract-utilities" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.823781 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="extract-utilities" Oct 03 11:15:55 crc kubenswrapper[4990]: E1003 11:15:55.823799 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="extract-content" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.823807 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="extract-content" Oct 03 11:15:55 crc kubenswrapper[4990]: E1003 11:15:55.823827 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="registry-server" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.823835 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="registry-server" Oct 03 11:15:55 crc kubenswrapper[4990]: E1003 11:15:55.823853 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" 
containerName="mariadb-database-create" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.823861 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" containerName="mariadb-database-create" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.824045 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" containerName="mariadb-database-create" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.824068 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4221292-a56c-4751-b423-8a5111b27b91" containerName="registry-server" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.824739 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.827683 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.842927 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bc6c-account-create-zcz47"] Oct 03 11:15:55 crc kubenswrapper[4990]: I1003 11:15:55.978501 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknbw\" (UniqueName: \"kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw\") pod \"barbican-bc6c-account-create-zcz47\" (UID: \"146d3ad2-f23d-47b9-a404-8b048b59c8fc\") " pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:56 crc kubenswrapper[4990]: I1003 11:15:56.080403 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknbw\" (UniqueName: \"kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw\") pod \"barbican-bc6c-account-create-zcz47\" (UID: \"146d3ad2-f23d-47b9-a404-8b048b59c8fc\") " 
pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:56 crc kubenswrapper[4990]: I1003 11:15:56.113802 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknbw\" (UniqueName: \"kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw\") pod \"barbican-bc6c-account-create-zcz47\" (UID: \"146d3ad2-f23d-47b9-a404-8b048b59c8fc\") " pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:56 crc kubenswrapper[4990]: I1003 11:15:56.166503 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:56 crc kubenswrapper[4990]: I1003 11:15:56.708707 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bc6c-account-create-zcz47"] Oct 03 11:15:57 crc kubenswrapper[4990]: I1003 11:15:57.717454 4990 generic.go:334] "Generic (PLEG): container finished" podID="146d3ad2-f23d-47b9-a404-8b048b59c8fc" containerID="5eba6f43018d61a6824deadc2094ff2cb7391095777adbdd5d88502a859b6087" exitCode=0 Oct 03 11:15:57 crc kubenswrapper[4990]: I1003 11:15:57.717530 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc6c-account-create-zcz47" event={"ID":"146d3ad2-f23d-47b9-a404-8b048b59c8fc","Type":"ContainerDied","Data":"5eba6f43018d61a6824deadc2094ff2cb7391095777adbdd5d88502a859b6087"} Oct 03 11:15:57 crc kubenswrapper[4990]: I1003 11:15:57.717893 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc6c-account-create-zcz47" event={"ID":"146d3ad2-f23d-47b9-a404-8b048b59c8fc","Type":"ContainerStarted","Data":"d31851828a944c3b68ab143d63a0be52bc95e8df448d000a343bca026f92b818"} Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.071790 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.243972 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknbw\" (UniqueName: \"kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw\") pod \"146d3ad2-f23d-47b9-a404-8b048b59c8fc\" (UID: \"146d3ad2-f23d-47b9-a404-8b048b59c8fc\") " Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.260900 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw" (OuterVolumeSpecName: "kube-api-access-hknbw") pod "146d3ad2-f23d-47b9-a404-8b048b59c8fc" (UID: "146d3ad2-f23d-47b9-a404-8b048b59c8fc"). InnerVolumeSpecName "kube-api-access-hknbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.345818 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hknbw\" (UniqueName: \"kubernetes.io/projected/146d3ad2-f23d-47b9-a404-8b048b59c8fc-kube-api-access-hknbw\") on node \"crc\" DevicePath \"\"" Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.732770 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bc6c-account-create-zcz47" event={"ID":"146d3ad2-f23d-47b9-a404-8b048b59c8fc","Type":"ContainerDied","Data":"d31851828a944c3b68ab143d63a0be52bc95e8df448d000a343bca026f92b818"} Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.732808 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31851828a944c3b68ab143d63a0be52bc95e8df448d000a343bca026f92b818" Oct 03 11:15:59 crc kubenswrapper[4990]: I1003 11:15:59.732824 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bc6c-account-create-zcz47" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.014621 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6qnlk"] Oct 03 11:16:01 crc kubenswrapper[4990]: E1003 11:16:01.015195 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146d3ad2-f23d-47b9-a404-8b048b59c8fc" containerName="mariadb-account-create" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.015208 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="146d3ad2-f23d-47b9-a404-8b048b59c8fc" containerName="mariadb-account-create" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.015373 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="146d3ad2-f23d-47b9-a404-8b048b59c8fc" containerName="mariadb-account-create" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.015911 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.018181 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.019792 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rqxp4" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.031875 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6qnlk"] Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.178757 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.178827 
4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59t7p\" (UniqueName: \"kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.178862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.280112 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.280194 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59t7p\" (UniqueName: \"kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.280227 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.288418 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.292073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.311081 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59t7p\" (UniqueName: \"kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p\") pod \"barbican-db-sync-6qnlk\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.360803 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:01 crc kubenswrapper[4990]: I1003 11:16:01.808745 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6qnlk"] Oct 03 11:16:02 crc kubenswrapper[4990]: I1003 11:16:02.759057 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6qnlk" event={"ID":"5f9ecda3-14e3-4135-af9d-e975c262734d","Type":"ContainerStarted","Data":"4e155499c1209a30f5b907aaef9340aa5a8adf92e8fda1755972c120c85b5bec"} Oct 03 11:16:02 crc kubenswrapper[4990]: I1003 11:16:02.759557 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6qnlk" event={"ID":"5f9ecda3-14e3-4135-af9d-e975c262734d","Type":"ContainerStarted","Data":"543102a2d2aad3a73a2c155be645a9eeda085158f1239bf06afd849468dacc73"} Oct 03 11:16:02 crc kubenswrapper[4990]: I1003 11:16:02.787779 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6qnlk" podStartSLOduration=2.787753475 podStartE2EDuration="2.787753475s" podCreationTimestamp="2025-10-03 11:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:02.779728089 +0000 UTC m=+5544.576359966" watchObservedRunningTime="2025-10-03 11:16:02.787753475 +0000 UTC m=+5544.584385352" Oct 03 11:16:04 crc kubenswrapper[4990]: I1003 11:16:04.776487 4990 generic.go:334] "Generic (PLEG): container finished" podID="5f9ecda3-14e3-4135-af9d-e975c262734d" containerID="4e155499c1209a30f5b907aaef9340aa5a8adf92e8fda1755972c120c85b5bec" exitCode=0 Oct 03 11:16:04 crc kubenswrapper[4990]: I1003 11:16:04.776541 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6qnlk" event={"ID":"5f9ecda3-14e3-4135-af9d-e975c262734d","Type":"ContainerDied","Data":"4e155499c1209a30f5b907aaef9340aa5a8adf92e8fda1755972c120c85b5bec"} Oct 03 11:16:06 crc 
kubenswrapper[4990]: I1003 11:16:06.178437 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.368599 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle\") pod \"5f9ecda3-14e3-4135-af9d-e975c262734d\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.368674 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59t7p\" (UniqueName: \"kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p\") pod \"5f9ecda3-14e3-4135-af9d-e975c262734d\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.368754 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data\") pod \"5f9ecda3-14e3-4135-af9d-e975c262734d\" (UID: \"5f9ecda3-14e3-4135-af9d-e975c262734d\") " Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.374385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5f9ecda3-14e3-4135-af9d-e975c262734d" (UID: "5f9ecda3-14e3-4135-af9d-e975c262734d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.374867 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p" (OuterVolumeSpecName: "kube-api-access-59t7p") pod "5f9ecda3-14e3-4135-af9d-e975c262734d" (UID: "5f9ecda3-14e3-4135-af9d-e975c262734d"). InnerVolumeSpecName "kube-api-access-59t7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.390847 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f9ecda3-14e3-4135-af9d-e975c262734d" (UID: "5f9ecda3-14e3-4135-af9d-e975c262734d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.471897 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.471957 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59t7p\" (UniqueName: \"kubernetes.io/projected/5f9ecda3-14e3-4135-af9d-e975c262734d-kube-api-access-59t7p\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.471976 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f9ecda3-14e3-4135-af9d-e975c262734d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.794998 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6qnlk" 
event={"ID":"5f9ecda3-14e3-4135-af9d-e975c262734d","Type":"ContainerDied","Data":"543102a2d2aad3a73a2c155be645a9eeda085158f1239bf06afd849468dacc73"} Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.795292 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="543102a2d2aad3a73a2c155be645a9eeda085158f1239bf06afd849468dacc73" Oct 03 11:16:06 crc kubenswrapper[4990]: I1003 11:16:06.795049 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6qnlk" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.061000 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c959bd486-zgkx8"] Oct 03 11:16:07 crc kubenswrapper[4990]: E1003 11:16:07.061340 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9ecda3-14e3-4135-af9d-e975c262734d" containerName="barbican-db-sync" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.061357 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ecda3-14e3-4135-af9d-e975c262734d" containerName="barbican-db-sync" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.061524 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9ecda3-14e3-4135-af9d-e975c262734d" containerName="barbican-db-sync" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.062293 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.067189 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rqxp4" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.067308 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.067399 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.081455 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c959bd486-zgkx8"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.093849 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-75cfdf8777-z9jlb"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.095057 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.097594 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.131366 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-75cfdf8777-z9jlb"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.151176 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.153337 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.177272 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.183467 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data-custom\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.183525 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-combined-ca-bundle\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.183567 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2xn\" (UniqueName: \"kubernetes.io/projected/745cbe6e-a927-48db-af8e-f94d85b6f484-kube-api-access-gk2xn\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.183589 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745cbe6e-a927-48db-af8e-f94d85b6f484-logs\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 
crc kubenswrapper[4990]: I1003 11:16:07.183686 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.221172 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.224219 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.231195 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.233351 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285722 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2t2\" (UniqueName: \"kubernetes.io/projected/5c935d0e-a506-4979-b44f-16d8ea969d14-kube-api-access-rn2t2\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285775 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hqk\" (UniqueName: \"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 
11:16:07.285806 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-combined-ca-bundle\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285831 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285856 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285915 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data-custom\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " 
pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285942 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-combined-ca-bundle\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data-custom\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.285998 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2xn\" (UniqueName: \"kubernetes.io/projected/745cbe6e-a927-48db-af8e-f94d85b6f484-kube-api-access-gk2xn\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.286019 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745cbe6e-a927-48db-af8e-f94d85b6f484-logs\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.286036 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config\") pod 
\"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.286050 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c935d0e-a506-4979-b44f-16d8ea969d14-logs\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.286078 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.286112 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.288306 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/745cbe6e-a927-48db-af8e-f94d85b6f484-logs\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.293730 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data-custom\") pod 
\"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.293862 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-config-data\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.305186 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/745cbe6e-a927-48db-af8e-f94d85b6f484-combined-ca-bundle\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.316035 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2xn\" (UniqueName: \"kubernetes.io/projected/745cbe6e-a927-48db-af8e-f94d85b6f484-kube-api-access-gk2xn\") pod \"barbican-keystone-listener-c959bd486-zgkx8\" (UID: \"745cbe6e-a927-48db-af8e-f94d85b6f484\") " pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.381587 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387100 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387137 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387161 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5bj\" (UniqueName: \"kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387191 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data-custom\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387324 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: 
\"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387363 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c935d0e-a506-4979-b44f-16d8ea969d14-logs\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.387961 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c935d0e-a506-4979-b44f-16d8ea969d14-logs\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388379 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388433 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388557 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc 
kubenswrapper[4990]: I1003 11:16:07.388696 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388731 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2t2\" (UniqueName: \"kubernetes.io/projected/5c935d0e-a506-4979-b44f-16d8ea969d14-kube-api-access-rn2t2\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388761 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hqk\" (UniqueName: \"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388790 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388825 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" 
Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388863 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-combined-ca-bundle\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.388894 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.389338 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.389476 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.389869 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.394299 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data-custom\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.394798 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-config-data\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.395283 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c935d0e-a506-4979-b44f-16d8ea969d14-combined-ca-bundle\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.407722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2t2\" (UniqueName: \"kubernetes.io/projected/5c935d0e-a506-4979-b44f-16d8ea969d14-kube-api-access-rn2t2\") pod \"barbican-worker-75cfdf8777-z9jlb\" (UID: \"5c935d0e-a506-4979-b44f-16d8ea969d14\") " pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.408853 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hqk\" (UniqueName: \"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk\") pod \"dnsmasq-dns-5b8f884bf9-4dfpg\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.429320 4990 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-worker-75cfdf8777-z9jlb" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.490318 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.490369 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.490387 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.490425 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.490444 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5bj\" (UniqueName: \"kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " 
pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.491575 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.496159 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.497308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.498843 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.499488 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.524545 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5bj\" (UniqueName: \"kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj\") pod \"barbican-api-6fd759487d-5vrmr\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.550202 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:07 crc kubenswrapper[4990]: I1003 11:16:07.909054 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c959bd486-zgkx8"] Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.011047 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-75cfdf8777-z9jlb"] Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.016902 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.186477 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.811133 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerStarted","Data":"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.811413 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerStarted","Data":"6024c53da2e52cd5cdfe7c267f9aa7e6cb34a42e30d001487da0950bcf9157f8"} Oct 03 11:16:08 crc kubenswrapper[4990]: 
I1003 11:16:08.812797 4990 generic.go:334] "Generic (PLEG): container finished" podID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerID="00975977b70c14e762c13a40ea1be06edca709faa936d6968528eb1b397359ee" exitCode=0 Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.812866 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" event={"ID":"bf17efcf-a86c-4335-bb56-3b52134a7ff6","Type":"ContainerDied","Data":"00975977b70c14e762c13a40ea1be06edca709faa936d6968528eb1b397359ee"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.812897 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" event={"ID":"bf17efcf-a86c-4335-bb56-3b52134a7ff6","Type":"ContainerStarted","Data":"59224563cd572557782a396412f3e526862cc84e7844b9b010a4e0c26f9b2d89"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.820257 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" event={"ID":"745cbe6e-a927-48db-af8e-f94d85b6f484","Type":"ContainerStarted","Data":"68be06476db925c09a1ae3f23136ab4b64418fda97dd331baec3efb850710548"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.820596 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" event={"ID":"745cbe6e-a927-48db-af8e-f94d85b6f484","Type":"ContainerStarted","Data":"f3e1d6f87fb47ccb311aecbd5e2fd0653a129e8fe2bc498e3023367f7c87cc6e"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.820609 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" event={"ID":"745cbe6e-a927-48db-af8e-f94d85b6f484","Type":"ContainerStarted","Data":"4abd8e51d600af17269344b25e8072a1300938f2c4b50f26c68634d958e60f66"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.824273 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75cfdf8777-z9jlb" 
event={"ID":"5c935d0e-a506-4979-b44f-16d8ea969d14","Type":"ContainerStarted","Data":"de4ffedbe47545532b82193bfd192d74a7ba4c62c431d7ff21582279f9f419cb"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.824315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75cfdf8777-z9jlb" event={"ID":"5c935d0e-a506-4979-b44f-16d8ea969d14","Type":"ContainerStarted","Data":"60dbeacdf94bb6e0f334dd0202ee0bba97dda28c452e741d1c759b9c7ae82a5b"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.824326 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75cfdf8777-z9jlb" event={"ID":"5c935d0e-a506-4979-b44f-16d8ea969d14","Type":"ContainerStarted","Data":"f571e69347f0f55c538e8d309a0a08642754e9beb69964ad967ffa8d0956ef38"} Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.923778 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c959bd486-zgkx8" podStartSLOduration=1.923756543 podStartE2EDuration="1.923756543s" podCreationTimestamp="2025-10-03 11:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:08.87412222 +0000 UTC m=+5550.670754077" watchObservedRunningTime="2025-10-03 11:16:08.923756543 +0000 UTC m=+5550.720388410" Oct 03 11:16:08 crc kubenswrapper[4990]: I1003 11:16:08.927354 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-75cfdf8777-z9jlb" podStartSLOduration=1.927343915 podStartE2EDuration="1.927343915s" podCreationTimestamp="2025-10-03 11:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:08.907079975 +0000 UTC m=+5550.703711832" watchObservedRunningTime="2025-10-03 11:16:08.927343915 +0000 UTC m=+5550.723975772" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 
11:16:09.429811 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f7b959788-gvwmt"] Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.431196 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.440765 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.441133 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.463321 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f7b959788-gvwmt"] Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.530743 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-combined-ca-bundle\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.531401 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-internal-tls-certs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.531542 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data-custom\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: 
\"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.531805 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c27\" (UniqueName: \"kubernetes.io/projected/a570c5ff-a74e-4741-8baa-036a4dfd9423-kube-api-access-46c27\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.531959 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-public-tls-certs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.532003 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570c5ff-a74e-4741-8baa-036a4dfd9423-logs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.532039 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.633257 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-internal-tls-certs\") pod 
\"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.633657 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data-custom\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.633845 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c27\" (UniqueName: \"kubernetes.io/projected/a570c5ff-a74e-4741-8baa-036a4dfd9423-kube-api-access-46c27\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.634367 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-public-tls-certs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.634913 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570c5ff-a74e-4741-8baa-036a4dfd9423-logs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.635090 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: 
\"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.635249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-combined-ca-bundle\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.635691 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570c5ff-a74e-4741-8baa-036a4dfd9423-logs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.638245 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-public-tls-certs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.638753 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-internal-tls-certs\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.638978 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-combined-ca-bundle\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " 
pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.641438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.656457 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a570c5ff-a74e-4741-8baa-036a4dfd9423-config-data-custom\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.658993 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c27\" (UniqueName: \"kubernetes.io/projected/a570c5ff-a74e-4741-8baa-036a4dfd9423-kube-api-access-46c27\") pod \"barbican-api-6f7b959788-gvwmt\" (UID: \"a570c5ff-a74e-4741-8baa-036a4dfd9423\") " pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.761681 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.834327 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerStarted","Data":"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6"} Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.834443 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.856120 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" event={"ID":"bf17efcf-a86c-4335-bb56-3b52134a7ff6","Type":"ContainerStarted","Data":"60925d16550e1f42fd4f9244844d930d27a5bf3a2967905822bb9a501f4b819a"} Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.856845 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:09 crc kubenswrapper[4990]: I1003 11:16:09.876000 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fd759487d-5vrmr" podStartSLOduration=2.875979998 podStartE2EDuration="2.875979998s" podCreationTimestamp="2025-10-03 11:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:09.867396028 +0000 UTC m=+5551.664027885" watchObservedRunningTime="2025-10-03 11:16:09.875979998 +0000 UTC m=+5551.672611855" Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.208744 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" podStartSLOduration=3.20872847 podStartE2EDuration="3.20872847s" podCreationTimestamp="2025-10-03 11:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:09.888970181 +0000 UTC m=+5551.685602038" watchObservedRunningTime="2025-10-03 11:16:10.20872847 +0000 UTC m=+5552.005360327" Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.215387 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f7b959788-gvwmt"] Oct 03 11:16:10 crc kubenswrapper[4990]: W1003 11:16:10.222831 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda570c5ff_a74e_4741_8baa_036a4dfd9423.slice/crio-9eef3bad27f5ca1397cd4340e0d83f6ac190f0b03f15a513990b0c0529988d50 WatchSource:0}: Error finding container 9eef3bad27f5ca1397cd4340e0d83f6ac190f0b03f15a513990b0c0529988d50: Status 404 returned error can't find the container with id 9eef3bad27f5ca1397cd4340e0d83f6ac190f0b03f15a513990b0c0529988d50 Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.867717 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b959788-gvwmt" event={"ID":"a570c5ff-a74e-4741-8baa-036a4dfd9423","Type":"ContainerStarted","Data":"8e2f3075ec11852c687cf39bfd143b0b2e6bf368a52209ffc106633234f4ac13"} Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.868019 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b959788-gvwmt" event={"ID":"a570c5ff-a74e-4741-8baa-036a4dfd9423","Type":"ContainerStarted","Data":"0caf8bd1e9129f4c99d43bec2921befc7228ed48ffcd477d2fdaa623efd4f950"} Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.868036 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f7b959788-gvwmt" event={"ID":"a570c5ff-a74e-4741-8baa-036a4dfd9423","Type":"ContainerStarted","Data":"9eef3bad27f5ca1397cd4340e0d83f6ac190f0b03f15a513990b0c0529988d50"} Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.868989 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.869037 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.869052 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:10 crc kubenswrapper[4990]: I1003 11:16:10.900718 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f7b959788-gvwmt" podStartSLOduration=1.9006972709999999 podStartE2EDuration="1.900697271s" podCreationTimestamp="2025-10-03 11:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:10.892692635 +0000 UTC m=+5552.689324502" watchObservedRunningTime="2025-10-03 11:16:10.900697271 +0000 UTC m=+5552.697329128" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.085457 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.087702 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.124052 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.167124 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.167316 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.167363 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq82v\" (UniqueName: \"kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.268827 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.268952 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.268986 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq82v\" (UniqueName: \"kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.269386 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.269476 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.289376 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq82v\" (UniqueName: \"kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v\") pod \"redhat-operators-dxfd2\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.408911 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:11 crc kubenswrapper[4990]: I1003 11:16:11.885240 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:12 crc kubenswrapper[4990]: I1003 11:16:12.887235 4990 generic.go:334] "Generic (PLEG): container finished" podID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerID="c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec" exitCode=0 Oct 03 11:16:12 crc kubenswrapper[4990]: I1003 11:16:12.887300 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerDied","Data":"c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec"} Oct 03 11:16:12 crc kubenswrapper[4990]: I1003 11:16:12.887640 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerStarted","Data":"244bae1d3a3ff7f350177022545ace3fb54794be6376ea558ce1d2e0473c9ebb"} Oct 03 11:16:14 crc kubenswrapper[4990]: I1003 11:16:14.911143 4990 generic.go:334] "Generic (PLEG): container finished" podID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerID="7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac" exitCode=0 Oct 03 11:16:14 crc kubenswrapper[4990]: I1003 11:16:14.911463 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerDied","Data":"7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac"} Oct 03 11:16:15 crc kubenswrapper[4990]: I1003 11:16:15.923027 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" 
event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerStarted","Data":"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566"} Oct 03 11:16:15 crc kubenswrapper[4990]: I1003 11:16:15.951111 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dxfd2" podStartSLOduration=2.549912027 podStartE2EDuration="4.951095223s" podCreationTimestamp="2025-10-03 11:16:11 +0000 UTC" firstStartedPulling="2025-10-03 11:16:12.8897499 +0000 UTC m=+5554.686381757" lastFinishedPulling="2025-10-03 11:16:15.290933106 +0000 UTC m=+5557.087564953" observedRunningTime="2025-10-03 11:16:15.948560488 +0000 UTC m=+5557.745192365" watchObservedRunningTime="2025-10-03 11:16:15.951095223 +0000 UTC m=+5557.747727080" Oct 03 11:16:17 crc kubenswrapper[4990]: I1003 11:16:17.500659 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:16:17 crc kubenswrapper[4990]: I1003 11:16:17.592542 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:16:17 crc kubenswrapper[4990]: I1003 11:16:17.592779 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="dnsmasq-dns" containerID="cri-o://d2cb415e5d02e9652aa908423c69f346346639185ea583bd0af35904f4ca4402" gracePeriod=10 Oct 03 11:16:17 crc kubenswrapper[4990]: I1003 11:16:17.946689 4990 generic.go:334] "Generic (PLEG): container finished" podID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerID="d2cb415e5d02e9652aa908423c69f346346639185ea583bd0af35904f4ca4402" exitCode=0 Oct 03 11:16:17 crc kubenswrapper[4990]: I1003 11:16:17.946730 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" 
event={"ID":"2ea090cc-48fe-4692-9b04-61efe1cf1770","Type":"ContainerDied","Data":"d2cb415e5d02e9652aa908423c69f346346639185ea583bd0af35904f4ca4402"} Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.087433 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.187370 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42mb\" (UniqueName: \"kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb\") pod \"2ea090cc-48fe-4692-9b04-61efe1cf1770\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.187450 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb\") pod \"2ea090cc-48fe-4692-9b04-61efe1cf1770\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.187470 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config\") pod \"2ea090cc-48fe-4692-9b04-61efe1cf1770\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.187685 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc\") pod \"2ea090cc-48fe-4692-9b04-61efe1cf1770\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.187768 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb\") 
pod \"2ea090cc-48fe-4692-9b04-61efe1cf1770\" (UID: \"2ea090cc-48fe-4692-9b04-61efe1cf1770\") " Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.236693 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb" (OuterVolumeSpecName: "kube-api-access-k42mb") pod "2ea090cc-48fe-4692-9b04-61efe1cf1770" (UID: "2ea090cc-48fe-4692-9b04-61efe1cf1770"). InnerVolumeSpecName "kube-api-access-k42mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.237210 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ea090cc-48fe-4692-9b04-61efe1cf1770" (UID: "2ea090cc-48fe-4692-9b04-61efe1cf1770"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.247754 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ea090cc-48fe-4692-9b04-61efe1cf1770" (UID: "2ea090cc-48fe-4692-9b04-61efe1cf1770"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.248611 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ea090cc-48fe-4692-9b04-61efe1cf1770" (UID: "2ea090cc-48fe-4692-9b04-61efe1cf1770"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.274179 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config" (OuterVolumeSpecName: "config") pod "2ea090cc-48fe-4692-9b04-61efe1cf1770" (UID: "2ea090cc-48fe-4692-9b04-61efe1cf1770"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.290211 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.290250 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.290266 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42mb\" (UniqueName: \"kubernetes.io/projected/2ea090cc-48fe-4692-9b04-61efe1cf1770-kube-api-access-k42mb\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.290280 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.290292 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ea090cc-48fe-4692-9b04-61efe1cf1770-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.974917 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" 
event={"ID":"2ea090cc-48fe-4692-9b04-61efe1cf1770","Type":"ContainerDied","Data":"827201c32bf15e0241dc27a21327750ec8a04e4eda2a0d46833a4cf836b6e4de"} Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.975389 4990 scope.go:117] "RemoveContainer" containerID="d2cb415e5d02e9652aa908423c69f346346639185ea583bd0af35904f4ca4402" Oct 03 11:16:18 crc kubenswrapper[4990]: I1003 11:16:18.975584 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f54884d97-gblkk" Oct 03 11:16:19 crc kubenswrapper[4990]: I1003 11:16:19.008070 4990 scope.go:117] "RemoveContainer" containerID="d4300692e3248b61c74ddd02d2a9e8c4bda6d45502ca21a5220710480195ffd8" Oct 03 11:16:19 crc kubenswrapper[4990]: I1003 11:16:19.018022 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:16:19 crc kubenswrapper[4990]: I1003 11:16:19.027269 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f54884d97-gblkk"] Oct 03 11:16:19 crc kubenswrapper[4990]: I1003 11:16:19.137052 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:19 crc kubenswrapper[4990]: I1003 11:16:19.289875 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:20 crc kubenswrapper[4990]: I1003 11:16:20.884006 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" path="/var/lib/kubelet/pods/2ea090cc-48fe-4692-9b04-61efe1cf1770/volumes" Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.201467 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.244661 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-6f7b959788-gvwmt" Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.330326 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.330692 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fd759487d-5vrmr" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api-log" containerID="cri-o://e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87" gracePeriod=30 Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.331173 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fd759487d-5vrmr" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api" containerID="cri-o://f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6" gracePeriod=30 Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.410096 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.410161 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:21 crc kubenswrapper[4990]: I1003 11:16:21.488745 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:22 crc kubenswrapper[4990]: I1003 11:16:22.003435 4990 generic.go:334] "Generic (PLEG): container finished" podID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerID="e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87" exitCode=143 Oct 03 11:16:22 crc kubenswrapper[4990]: I1003 11:16:22.003562 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" 
event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerDied","Data":"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87"} Oct 03 11:16:22 crc kubenswrapper[4990]: I1003 11:16:22.057266 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:22 crc kubenswrapper[4990]: I1003 11:16:22.119372 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.023028 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dxfd2" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="registry-server" containerID="cri-o://b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566" gracePeriod=2 Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.475097 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.484388 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fd759487d-5vrmr" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.36:9311/healthcheck\": read tcp 10.217.0.2:49314->10.217.1.36:9311: read: connection reset by peer" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.484402 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fd759487d-5vrmr" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.36:9311/healthcheck\": read tcp 10.217.0.2:49298->10.217.1.36:9311: read: connection reset by peer" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.633287 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content\") pod \"02d49fbc-3844-48a3-94ff-1595988d30cc\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.633343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities\") pod \"02d49fbc-3844-48a3-94ff-1595988d30cc\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.633442 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq82v\" (UniqueName: \"kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v\") pod \"02d49fbc-3844-48a3-94ff-1595988d30cc\" (UID: \"02d49fbc-3844-48a3-94ff-1595988d30cc\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.634163 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities" (OuterVolumeSpecName: "utilities") pod "02d49fbc-3844-48a3-94ff-1595988d30cc" (UID: "02d49fbc-3844-48a3-94ff-1595988d30cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.642326 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v" (OuterVolumeSpecName: "kube-api-access-pq82v") pod "02d49fbc-3844-48a3-94ff-1595988d30cc" (UID: "02d49fbc-3844-48a3-94ff-1595988d30cc"). InnerVolumeSpecName "kube-api-access-pq82v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.735796 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq82v\" (UniqueName: \"kubernetes.io/projected/02d49fbc-3844-48a3-94ff-1595988d30cc-kube-api-access-pq82v\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.735845 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.756910 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02d49fbc-3844-48a3-94ff-1595988d30cc" (UID: "02d49fbc-3844-48a3-94ff-1595988d30cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.782409 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.838495 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d49fbc-3844-48a3-94ff-1595988d30cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940031 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn5bj\" (UniqueName: \"kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj\") pod \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940236 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom\") pod \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940299 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs\") pod \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940346 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data\") pod \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940394 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle\") pod \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\" (UID: \"7617a646-f374-41b3-ae8f-cbb5a0e7724f\") " Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.940932 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs" (OuterVolumeSpecName: "logs") pod "7617a646-f374-41b3-ae8f-cbb5a0e7724f" (UID: "7617a646-f374-41b3-ae8f-cbb5a0e7724f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.942048 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7617a646-f374-41b3-ae8f-cbb5a0e7724f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.944504 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj" (OuterVolumeSpecName: "kube-api-access-xn5bj") pod "7617a646-f374-41b3-ae8f-cbb5a0e7724f" (UID: "7617a646-f374-41b3-ae8f-cbb5a0e7724f"). InnerVolumeSpecName "kube-api-access-xn5bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.946886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7617a646-f374-41b3-ae8f-cbb5a0e7724f" (UID: "7617a646-f374-41b3-ae8f-cbb5a0e7724f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.972703 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7617a646-f374-41b3-ae8f-cbb5a0e7724f" (UID: "7617a646-f374-41b3-ae8f-cbb5a0e7724f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:24 crc kubenswrapper[4990]: I1003 11:16:24.993597 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data" (OuterVolumeSpecName: "config-data") pod "7617a646-f374-41b3-ae8f-cbb5a0e7724f" (UID: "7617a646-f374-41b3-ae8f-cbb5a0e7724f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.034339 4990 generic.go:334] "Generic (PLEG): container finished" podID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerID="f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6" exitCode=0 Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.034452 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerDied","Data":"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6"} Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.034479 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd759487d-5vrmr" event={"ID":"7617a646-f374-41b3-ae8f-cbb5a0e7724f","Type":"ContainerDied","Data":"6024c53da2e52cd5cdfe7c267f9aa7e6cb34a42e30d001487da0950bcf9157f8"} Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.034496 4990 scope.go:117] "RemoveContainer" containerID="f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6" Oct 03 
11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.035636 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd759487d-5vrmr" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.037703 4990 generic.go:334] "Generic (PLEG): container finished" podID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerID="b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566" exitCode=0 Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.037742 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerDied","Data":"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566"} Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.037773 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dxfd2" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.037792 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dxfd2" event={"ID":"02d49fbc-3844-48a3-94ff-1595988d30cc","Type":"ContainerDied","Data":"244bae1d3a3ff7f350177022545ace3fb54794be6376ea558ce1d2e0473c9ebb"} Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.044788 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn5bj\" (UniqueName: \"kubernetes.io/projected/7617a646-f374-41b3-ae8f-cbb5a0e7724f-kube-api-access-xn5bj\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.044836 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.044857 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.044876 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617a646-f374-41b3-ae8f-cbb5a0e7724f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.064697 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.070375 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dxfd2"] Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.078309 4990 scope.go:117] "RemoveContainer" containerID="e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.096115 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.102928 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fd759487d-5vrmr"] Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.106091 4990 scope.go:117] "RemoveContainer" containerID="f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6" Oct 03 11:16:25 crc kubenswrapper[4990]: E1003 11:16:25.106766 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6\": container with ID starting with f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6 not found: ID does not exist" containerID="f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.106801 4990 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6"} err="failed to get container status \"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6\": rpc error: code = NotFound desc = could not find container \"f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6\": container with ID starting with f11c42fcb05789c8eeacc7651c5c9ba971d3ebd9b568dc44c96893264dbdc7d6 not found: ID does not exist" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.106823 4990 scope.go:117] "RemoveContainer" containerID="e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87" Oct 03 11:16:25 crc kubenswrapper[4990]: E1003 11:16:25.107295 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87\": container with ID starting with e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87 not found: ID does not exist" containerID="e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.107428 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87"} err="failed to get container status \"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87\": rpc error: code = NotFound desc = could not find container \"e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87\": container with ID starting with e2ddea130e52dd8cbcad28d513112a5c79bfa6444fcbe42f46cf114323ab5b87 not found: ID does not exist" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.107558 4990 scope.go:117] "RemoveContainer" containerID="b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.133064 4990 scope.go:117] "RemoveContainer" 
containerID="7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.203488 4990 scope.go:117] "RemoveContainer" containerID="c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.243399 4990 scope.go:117] "RemoveContainer" containerID="b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566" Oct 03 11:16:25 crc kubenswrapper[4990]: E1003 11:16:25.243932 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566\": container with ID starting with b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566 not found: ID does not exist" containerID="b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.243964 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566"} err="failed to get container status \"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566\": rpc error: code = NotFound desc = could not find container \"b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566\": container with ID starting with b3f2fee3b332f1e882f06ecb47f29a565e05d780b1e52e48fc62d059a503b566 not found: ID does not exist" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.243984 4990 scope.go:117] "RemoveContainer" containerID="7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac" Oct 03 11:16:25 crc kubenswrapper[4990]: E1003 11:16:25.244333 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac\": container with ID starting with 
7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac not found: ID does not exist" containerID="7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.244392 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac"} err="failed to get container status \"7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac\": rpc error: code = NotFound desc = could not find container \"7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac\": container with ID starting with 7de3b4db8767e4b379106aec88003c0973a11565e80b8ea5ff6f307822b108ac not found: ID does not exist" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.244426 4990 scope.go:117] "RemoveContainer" containerID="c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec" Oct 03 11:16:25 crc kubenswrapper[4990]: E1003 11:16:25.245072 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec\": container with ID starting with c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec not found: ID does not exist" containerID="c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec" Oct 03 11:16:25 crc kubenswrapper[4990]: I1003 11:16:25.245281 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec"} err="failed to get container status \"c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec\": rpc error: code = NotFound desc = could not find container \"c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec\": container with ID starting with c64b5e18dccc38f6cdc9975ffb91d19a86719dea1fee258ec795f44315b8a1ec not found: ID does not 
exist" Oct 03 11:16:26 crc kubenswrapper[4990]: I1003 11:16:26.894474 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" path="/var/lib/kubelet/pods/02d49fbc-3844-48a3-94ff-1595988d30cc/volumes" Oct 03 11:16:26 crc kubenswrapper[4990]: I1003 11:16:26.897159 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" path="/var/lib/kubelet/pods/7617a646-f374-41b3-ae8f-cbb5a0e7724f/volumes" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.053833 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-x4wxk"] Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.058679 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.058969 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.059050 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="registry-server" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.059126 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="registry-server" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.059210 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="extract-content" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.059312 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="extract-content" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.059394 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api-log" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.059464 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api-log" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.059569 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="init" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.059647 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="init" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.059742 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="extract-utilities" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.059982 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="extract-utilities" Oct 03 11:16:33 crc kubenswrapper[4990]: E1003 11:16:33.060068 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="dnsmasq-dns" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.060137 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="dnsmasq-dns" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.060410 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.060497 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea090cc-48fe-4692-9b04-61efe1cf1770" containerName="dnsmasq-dns" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.060638 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="02d49fbc-3844-48a3-94ff-1595988d30cc" containerName="registry-server" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.060730 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7617a646-f374-41b3-ae8f-cbb5a0e7724f" containerName="barbican-api-log" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.062109 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.066874 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-x4wxk"] Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.201452 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqjh\" (UniqueName: \"kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh\") pod \"neutron-db-create-x4wxk\" (UID: \"78cd9c7b-bbc1-4b75-a466-d98ae13d1883\") " pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.303293 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqjh\" (UniqueName: \"kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh\") pod \"neutron-db-create-x4wxk\" (UID: \"78cd9c7b-bbc1-4b75-a466-d98ae13d1883\") " pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.326766 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqjh\" (UniqueName: \"kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh\") pod \"neutron-db-create-x4wxk\" (UID: \"78cd9c7b-bbc1-4b75-a466-d98ae13d1883\") " pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.384326 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:33 crc kubenswrapper[4990]: I1003 11:16:33.817489 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-x4wxk"] Oct 03 11:16:34 crc kubenswrapper[4990]: I1003 11:16:34.133820 4990 generic.go:334] "Generic (PLEG): container finished" podID="78cd9c7b-bbc1-4b75-a466-d98ae13d1883" containerID="b11d11103862df7a7c17e9afe8d302f90948e74a4431307b88d15155d1bf7246" exitCode=0 Oct 03 11:16:34 crc kubenswrapper[4990]: I1003 11:16:34.133879 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x4wxk" event={"ID":"78cd9c7b-bbc1-4b75-a466-d98ae13d1883","Type":"ContainerDied","Data":"b11d11103862df7a7c17e9afe8d302f90948e74a4431307b88d15155d1bf7246"} Oct 03 11:16:34 crc kubenswrapper[4990]: I1003 11:16:34.133922 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x4wxk" event={"ID":"78cd9c7b-bbc1-4b75-a466-d98ae13d1883","Type":"ContainerStarted","Data":"5f06c8780de5edeafd30d47318b1842dc5f988de3c1fe333e12c2a61fcbd0078"} Oct 03 11:16:35 crc kubenswrapper[4990]: I1003 11:16:35.468697 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:35 crc kubenswrapper[4990]: I1003 11:16:35.645845 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgqjh\" (UniqueName: \"kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh\") pod \"78cd9c7b-bbc1-4b75-a466-d98ae13d1883\" (UID: \"78cd9c7b-bbc1-4b75-a466-d98ae13d1883\") " Oct 03 11:16:35 crc kubenswrapper[4990]: I1003 11:16:35.653828 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh" (OuterVolumeSpecName: "kube-api-access-rgqjh") pod "78cd9c7b-bbc1-4b75-a466-d98ae13d1883" (UID: "78cd9c7b-bbc1-4b75-a466-d98ae13d1883"). InnerVolumeSpecName "kube-api-access-rgqjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:35 crc kubenswrapper[4990]: I1003 11:16:35.748466 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgqjh\" (UniqueName: \"kubernetes.io/projected/78cd9c7b-bbc1-4b75-a466-d98ae13d1883-kube-api-access-rgqjh\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:36 crc kubenswrapper[4990]: I1003 11:16:36.153885 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x4wxk" event={"ID":"78cd9c7b-bbc1-4b75-a466-d98ae13d1883","Type":"ContainerDied","Data":"5f06c8780de5edeafd30d47318b1842dc5f988de3c1fe333e12c2a61fcbd0078"} Oct 03 11:16:36 crc kubenswrapper[4990]: I1003 11:16:36.153929 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f06c8780de5edeafd30d47318b1842dc5f988de3c1fe333e12c2a61fcbd0078" Oct 03 11:16:36 crc kubenswrapper[4990]: I1003 11:16:36.154019 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-x4wxk" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.156827 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b723-account-create-vl7nv"] Oct 03 11:16:43 crc kubenswrapper[4990]: E1003 11:16:43.157776 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cd9c7b-bbc1-4b75-a466-d98ae13d1883" containerName="mariadb-database-create" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.157795 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cd9c7b-bbc1-4b75-a466-d98ae13d1883" containerName="mariadb-database-create" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.158007 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cd9c7b-bbc1-4b75-a466-d98ae13d1883" containerName="mariadb-database-create" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.158564 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.161246 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.170086 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b723-account-create-vl7nv"] Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.283116 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rf5q\" (UniqueName: \"kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q\") pod \"neutron-b723-account-create-vl7nv\" (UID: \"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c\") " pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.384528 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rf5q\" (UniqueName: 
\"kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q\") pod \"neutron-b723-account-create-vl7nv\" (UID: \"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c\") " pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.409320 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rf5q\" (UniqueName: \"kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q\") pod \"neutron-b723-account-create-vl7nv\" (UID: \"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c\") " pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.479554 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:43 crc kubenswrapper[4990]: I1003 11:16:43.930381 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b723-account-create-vl7nv"] Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.213376 4990 generic.go:334] "Generic (PLEG): container finished" podID="f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" containerID="8688690f7c64f8f72db4158ca0da33060ef0a34441c0eea983ea642828f4e5c3" exitCode=0 Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.213422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b723-account-create-vl7nv" event={"ID":"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c","Type":"ContainerDied","Data":"8688690f7c64f8f72db4158ca0da33060ef0a34441c0eea983ea642828f4e5c3"} Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.213452 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b723-account-create-vl7nv" event={"ID":"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c","Type":"ContainerStarted","Data":"a2c8a2bb852e254fc83ac608036d8027089f372c3f239222bbb6c3fd512fd64e"} Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.392753 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.395280 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.409143 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.410381 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.410454 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.410482 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fft8f\" (UniqueName: \"kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.512514 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " 
pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.512695 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.512797 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fft8f\" (UniqueName: \"kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.513160 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.513215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.538882 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fft8f\" (UniqueName: \"kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f\") pod \"redhat-marketplace-rj5px\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " 
pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:44 crc kubenswrapper[4990]: I1003 11:16:44.775953 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:45 crc kubenswrapper[4990]: I1003 11:16:45.221681 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:45 crc kubenswrapper[4990]: W1003 11:16:45.233128 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c4400d_ae00_4391_a3d0_6e944bc2137c.slice/crio-724b028813afa9685bb7ed0b7c1bb319b4ecc76b7d3aa752b785fa03a7f628d8 WatchSource:0}: Error finding container 724b028813afa9685bb7ed0b7c1bb319b4ecc76b7d3aa752b785fa03a7f628d8: Status 404 returned error can't find the container with id 724b028813afa9685bb7ed0b7c1bb319b4ecc76b7d3aa752b785fa03a7f628d8 Oct 03 11:16:45 crc kubenswrapper[4990]: I1003 11:16:45.514886 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:45 crc kubenswrapper[4990]: I1003 11:16:45.533953 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rf5q\" (UniqueName: \"kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q\") pod \"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c\" (UID: \"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c\") " Oct 03 11:16:45 crc kubenswrapper[4990]: I1003 11:16:45.542469 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q" (OuterVolumeSpecName: "kube-api-access-8rf5q") pod "f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" (UID: "f4a57ab6-7281-437a-a89b-ea4e74b6cb6c"). InnerVolumeSpecName "kube-api-access-8rf5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:45 crc kubenswrapper[4990]: I1003 11:16:45.634980 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rf5q\" (UniqueName: \"kubernetes.io/projected/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c-kube-api-access-8rf5q\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.230741 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b723-account-create-vl7nv" event={"ID":"f4a57ab6-7281-437a-a89b-ea4e74b6cb6c","Type":"ContainerDied","Data":"a2c8a2bb852e254fc83ac608036d8027089f372c3f239222bbb6c3fd512fd64e"} Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.232001 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c8a2bb852e254fc83ac608036d8027089f372c3f239222bbb6c3fd512fd64e" Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.230784 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b723-account-create-vl7nv" Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.232257 4990 generic.go:334] "Generic (PLEG): container finished" podID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerID="07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12" exitCode=0 Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.232296 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerDied","Data":"07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12"} Oct 03 11:16:46 crc kubenswrapper[4990]: I1003 11:16:46.232328 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerStarted","Data":"724b028813afa9685bb7ed0b7c1bb319b4ecc76b7d3aa752b785fa03a7f628d8"} Oct 03 11:16:48 crc 
kubenswrapper[4990]: I1003 11:16:48.265288 4990 generic.go:334] "Generic (PLEG): container finished" podID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerID="c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85" exitCode=0 Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.265744 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerDied","Data":"c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85"} Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.378684 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fnnfs"] Oct 03 11:16:48 crc kubenswrapper[4990]: E1003 11:16:48.379081 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" containerName="mariadb-account-create" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.379093 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" containerName="mariadb-account-create" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.379278 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" containerName="mariadb-account-create" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.379837 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.381820 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.384830 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.385700 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xskwk" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.393671 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fnnfs"] Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.486257 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.486631 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthr5\" (UniqueName: \"kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.486799 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.588135 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nthr5\" (UniqueName: \"kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.588238 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.588347 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.598283 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.598799 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.612674 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthr5\" (UniqueName: 
\"kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5\") pod \"neutron-db-sync-fnnfs\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:48 crc kubenswrapper[4990]: I1003 11:16:48.701764 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:49 crc kubenswrapper[4990]: I1003 11:16:49.251886 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fnnfs"] Oct 03 11:16:49 crc kubenswrapper[4990]: W1003 11:16:49.255618 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb20843f_2c82_4d5c_b069_7740e7af9777.slice/crio-d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608 WatchSource:0}: Error finding container d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608: Status 404 returned error can't find the container with id d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608 Oct 03 11:16:49 crc kubenswrapper[4990]: I1003 11:16:49.281717 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerStarted","Data":"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99"} Oct 03 11:16:49 crc kubenswrapper[4990]: I1003 11:16:49.283202 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnnfs" event={"ID":"bb20843f-2c82-4d5c-b069-7740e7af9777","Type":"ContainerStarted","Data":"d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608"} Oct 03 11:16:49 crc kubenswrapper[4990]: I1003 11:16:49.307393 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rj5px" podStartSLOduration=2.778324882 podStartE2EDuration="5.307371227s" 
podCreationTimestamp="2025-10-03 11:16:44 +0000 UTC" firstStartedPulling="2025-10-03 11:16:46.233974985 +0000 UTC m=+5588.030606842" lastFinishedPulling="2025-10-03 11:16:48.763010409 +0000 UTC m=+5590.559653187" observedRunningTime="2025-10-03 11:16:49.307327846 +0000 UTC m=+5591.103959713" watchObservedRunningTime="2025-10-03 11:16:49.307371227 +0000 UTC m=+5591.104003094" Oct 03 11:16:50 crc kubenswrapper[4990]: I1003 11:16:50.296028 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnnfs" event={"ID":"bb20843f-2c82-4d5c-b069-7740e7af9777","Type":"ContainerStarted","Data":"def4c4a1ecfd8764dd90657ae49e91d3ed0c784a2332913c1ffc040c0c2aafc4"} Oct 03 11:16:50 crc kubenswrapper[4990]: I1003 11:16:50.314967 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fnnfs" podStartSLOduration=2.31494597 podStartE2EDuration="2.31494597s" podCreationTimestamp="2025-10-03 11:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:50.312627601 +0000 UTC m=+5592.109259478" watchObservedRunningTime="2025-10-03 11:16:50.31494597 +0000 UTC m=+5592.111577827" Oct 03 11:16:54 crc kubenswrapper[4990]: I1003 11:16:54.346295 4990 generic.go:334] "Generic (PLEG): container finished" podID="bb20843f-2c82-4d5c-b069-7740e7af9777" containerID="def4c4a1ecfd8764dd90657ae49e91d3ed0c784a2332913c1ffc040c0c2aafc4" exitCode=0 Oct 03 11:16:54 crc kubenswrapper[4990]: I1003 11:16:54.346375 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnnfs" event={"ID":"bb20843f-2c82-4d5c-b069-7740e7af9777","Type":"ContainerDied","Data":"def4c4a1ecfd8764dd90657ae49e91d3ed0c784a2332913c1ffc040c0c2aafc4"} Oct 03 11:16:54 crc kubenswrapper[4990]: I1003 11:16:54.776866 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rj5px" 
Oct 03 11:16:54 crc kubenswrapper[4990]: I1003 11:16:54.776937 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:54 crc kubenswrapper[4990]: I1003 11:16:54.851346 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.404163 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.458434 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.692733 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.828596 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle\") pod \"bb20843f-2c82-4d5c-b069-7740e7af9777\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.828657 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nthr5\" (UniqueName: \"kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5\") pod \"bb20843f-2c82-4d5c-b069-7740e7af9777\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.828816 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config\") pod \"bb20843f-2c82-4d5c-b069-7740e7af9777\" (UID: \"bb20843f-2c82-4d5c-b069-7740e7af9777\") " Oct 03 11:16:55 
crc kubenswrapper[4990]: I1003 11:16:55.835458 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5" (OuterVolumeSpecName: "kube-api-access-nthr5") pod "bb20843f-2c82-4d5c-b069-7740e7af9777" (UID: "bb20843f-2c82-4d5c-b069-7740e7af9777"). InnerVolumeSpecName "kube-api-access-nthr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.853976 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config" (OuterVolumeSpecName: "config") pod "bb20843f-2c82-4d5c-b069-7740e7af9777" (UID: "bb20843f-2c82-4d5c-b069-7740e7af9777"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.863338 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb20843f-2c82-4d5c-b069-7740e7af9777" (UID: "bb20843f-2c82-4d5c-b069-7740e7af9777"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.932264 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.932295 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nthr5\" (UniqueName: \"kubernetes.io/projected/bb20843f-2c82-4d5c-b069-7740e7af9777-kube-api-access-nthr5\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:55 crc kubenswrapper[4990]: I1003 11:16:55.932312 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb20843f-2c82-4d5c-b069-7740e7af9777-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.410961 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fnnfs" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.411227 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fnnfs" event={"ID":"bb20843f-2c82-4d5c-b069-7740e7af9777","Type":"ContainerDied","Data":"d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608"} Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.411465 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42d010a545510ceb4cc324fc0d304eee59102df86af01df255d40dded8ed608" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.628845 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:16:56 crc kubenswrapper[4990]: E1003 11:16:56.629304 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb20843f-2c82-4d5c-b069-7740e7af9777" containerName="neutron-db-sync" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.629327 4990 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bb20843f-2c82-4d5c-b069-7740e7af9777" containerName="neutron-db-sync" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.631455 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb20843f-2c82-4d5c-b069-7740e7af9777" containerName="neutron-db-sync" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.632679 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.642535 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.696918 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.708861 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.708958 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.712687 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.713725 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.713893 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.714054 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xskwk" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.748919 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.748978 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.749100 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 
11:16:56.749157 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.749327 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rkg\" (UniqueName: \"kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.850825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.850871 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.850913 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 
11:16:56.850964 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.850983 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.851025 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.851067 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.851099 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpqp\" (UniqueName: \"kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.851126 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.851146 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rkg\" (UniqueName: \"kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.852288 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.852618 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.852865 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.853307 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.875423 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rkg\" (UniqueName: \"kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg\") pod \"dnsmasq-dns-5d954d6867-dtm7x\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952306 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpqp\" (UniqueName: \"kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952368 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952421 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952459 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config\") pod 
\"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952477 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.952698 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.956941 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.957110 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.957363 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.958416 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:56 crc kubenswrapper[4990]: I1003 11:16:56.970433 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpqp\" (UniqueName: \"kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp\") pod \"neutron-7896899cb8-g24bj\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.038240 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.416834 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rj5px" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="registry-server" containerID="cri-o://18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99" gracePeriod=2 Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.479079 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:16:57 crc kubenswrapper[4990]: W1003 11:16:57.525219 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79caa14e_728c_4592_99f7_6779ece3603c.slice/crio-6358c09d6f98bef7ca10b4c7e5541c7e0d03ce4a1385e75b9fef5c264c5a288d WatchSource:0}: Error finding container 6358c09d6f98bef7ca10b4c7e5541c7e0d03ce4a1385e75b9fef5c264c5a288d: Status 404 returned error can't find the container with id 6358c09d6f98bef7ca10b4c7e5541c7e0d03ce4a1385e75b9fef5c264c5a288d Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.743857 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 
03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.843099 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.972190 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities\") pod \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.972344 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fft8f\" (UniqueName: \"kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f\") pod \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.972471 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content\") pod \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\" (UID: \"a3c4400d-ae00-4391-a3d0-6e944bc2137c\") " Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.973407 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities" (OuterVolumeSpecName: "utilities") pod "a3c4400d-ae00-4391-a3d0-6e944bc2137c" (UID: "a3c4400d-ae00-4391-a3d0-6e944bc2137c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.975240 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:57 crc kubenswrapper[4990]: I1003 11:16:57.977705 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f" (OuterVolumeSpecName: "kube-api-access-fft8f") pod "a3c4400d-ae00-4391-a3d0-6e944bc2137c" (UID: "a3c4400d-ae00-4391-a3d0-6e944bc2137c"). InnerVolumeSpecName "kube-api-access-fft8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.004278 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3c4400d-ae00-4391-a3d0-6e944bc2137c" (UID: "a3c4400d-ae00-4391-a3d0-6e944bc2137c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.076590 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fft8f\" (UniqueName: \"kubernetes.io/projected/a3c4400d-ae00-4391-a3d0-6e944bc2137c-kube-api-access-fft8f\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.076623 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c4400d-ae00-4391-a3d0-6e944bc2137c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.426275 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerStarted","Data":"c7c995dd130b6768d5d5cb4b64a814954547bcee9e894605854f70f2b3b519b6"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.426865 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerStarted","Data":"865e553cee902bd3fab3628a511fc97521c4b8047ac864fdfa8536d7d1df35f2"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.426904 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.426917 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerStarted","Data":"f779cddef1a8f4032bec012e8fa67b7dbd9c4e80aefceeec9f1aff28c866e44d"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.427710 4990 generic.go:334] "Generic (PLEG): container finished" podID="79caa14e-728c-4592-99f7-6779ece3603c" containerID="67423d40a7731a7603c1d3faec513b72cae043b6091331d72838320c19244dc2" exitCode=0 Oct 03 11:16:58 crc 
kubenswrapper[4990]: I1003 11:16:58.427759 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" event={"ID":"79caa14e-728c-4592-99f7-6779ece3603c","Type":"ContainerDied","Data":"67423d40a7731a7603c1d3faec513b72cae043b6091331d72838320c19244dc2"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.427824 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" event={"ID":"79caa14e-728c-4592-99f7-6779ece3603c","Type":"ContainerStarted","Data":"6358c09d6f98bef7ca10b4c7e5541c7e0d03ce4a1385e75b9fef5c264c5a288d"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.430660 4990 generic.go:334] "Generic (PLEG): container finished" podID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerID="18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99" exitCode=0 Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.430745 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerDied","Data":"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.430810 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rj5px" event={"ID":"a3c4400d-ae00-4391-a3d0-6e944bc2137c","Type":"ContainerDied","Data":"724b028813afa9685bb7ed0b7c1bb319b4ecc76b7d3aa752b785fa03a7f628d8"} Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.430856 4990 scope.go:117] "RemoveContainer" containerID="18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.430769 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rj5px" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.454676 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7896899cb8-g24bj" podStartSLOduration=2.4546517420000002 podStartE2EDuration="2.454651742s" podCreationTimestamp="2025-10-03 11:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:58.450325451 +0000 UTC m=+5600.246957328" watchObservedRunningTime="2025-10-03 11:16:58.454651742 +0000 UTC m=+5600.251283599" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.525976 4990 scope.go:117] "RemoveContainer" containerID="c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.528749 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.538255 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rj5px"] Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.579151 4990 scope.go:117] "RemoveContainer" containerID="07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.613596 4990 scope.go:117] "RemoveContainer" containerID="18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99" Oct 03 11:16:58 crc kubenswrapper[4990]: E1003 11:16:58.614088 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99\": container with ID starting with 18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99 not found: ID does not exist" containerID="18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99" 
Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.614117 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99"} err="failed to get container status \"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99\": rpc error: code = NotFound desc = could not find container \"18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99\": container with ID starting with 18cc074d6d8fedf69277a6875e00ab65bb9c8f862ac08cb0141e25f5d3785e99 not found: ID does not exist" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.614135 4990 scope.go:117] "RemoveContainer" containerID="c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85" Oct 03 11:16:58 crc kubenswrapper[4990]: E1003 11:16:58.614642 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85\": container with ID starting with c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85 not found: ID does not exist" containerID="c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.614667 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85"} err="failed to get container status \"c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85\": rpc error: code = NotFound desc = could not find container \"c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85\": container with ID starting with c53bf79e86a1bd048939c6b0ef5e631a6a1745a61c3af5f8e3835b3cd8a99e85 not found: ID does not exist" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.614680 4990 scope.go:117] "RemoveContainer" 
containerID="07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12" Oct 03 11:16:58 crc kubenswrapper[4990]: E1003 11:16:58.614923 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12\": container with ID starting with 07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12 not found: ID does not exist" containerID="07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.614940 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12"} err="failed to get container status \"07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12\": rpc error: code = NotFound desc = could not find container \"07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12\": container with ID starting with 07866aa85cef66bef5fc398228ad3df44b6b9a74fe744481f3606fc126712f12 not found: ID does not exist" Oct 03 11:16:58 crc kubenswrapper[4990]: I1003 11:16:58.892157 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" path="/var/lib/kubelet/pods/a3c4400d-ae00-4391-a3d0-6e944bc2137c/volumes" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.440126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" event={"ID":"79caa14e-728c-4592-99f7-6779ece3603c","Type":"ContainerStarted","Data":"5e97c2eab106ae6256d5c8598ed04045801bf712a1727dcb23afcd60619e343b"} Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.440549 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.483749 4990 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" podStartSLOduration=3.483728888 podStartE2EDuration="3.483728888s" podCreationTimestamp="2025-10-03 11:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:16:59.476753239 +0000 UTC m=+5601.273385116" watchObservedRunningTime="2025-10-03 11:16:59.483728888 +0000 UTC m=+5601.280360745" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.735941 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-844c8c7569-rln6q"] Oct 03 11:16:59 crc kubenswrapper[4990]: E1003 11:16:59.736394 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="registry-server" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.736412 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="registry-server" Oct 03 11:16:59 crc kubenswrapper[4990]: E1003 11:16:59.736468 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="extract-content" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.736476 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="extract-content" Oct 03 11:16:59 crc kubenswrapper[4990]: E1003 11:16:59.736486 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="extract-utilities" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.736493 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" containerName="extract-utilities" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.736881 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c4400d-ae00-4391-a3d0-6e944bc2137c" 
containerName="registry-server" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.741324 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.743277 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.743670 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.748498 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-844c8c7569-rln6q"] Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804656 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-httpd-config\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-combined-ca-bundle\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804757 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-ovndb-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804791 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-public-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804864 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-config\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804899 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrnfh\" (UniqueName: \"kubernetes.io/projected/8e437b5a-e091-4f97-94d4-4a490365e5fa-kube-api-access-xrnfh\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.804947 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-internal-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906235 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-config\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906292 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrnfh\" (UniqueName: \"kubernetes.io/projected/8e437b5a-e091-4f97-94d4-4a490365e5fa-kube-api-access-xrnfh\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906333 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-internal-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906415 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-httpd-config\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906439 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-combined-ca-bundle\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906475 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-ovndb-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.906502 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-public-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.912186 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-combined-ca-bundle\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.912803 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-internal-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.913228 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-public-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.915090 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-httpd-config\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.917602 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-config\") pod \"neutron-844c8c7569-rln6q\" (UID: 
\"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.923630 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e437b5a-e091-4f97-94d4-4a490365e5fa-ovndb-tls-certs\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:16:59 crc kubenswrapper[4990]: I1003 11:16:59.937438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrnfh\" (UniqueName: \"kubernetes.io/projected/8e437b5a-e091-4f97-94d4-4a490365e5fa-kube-api-access-xrnfh\") pod \"neutron-844c8c7569-rln6q\" (UID: \"8e437b5a-e091-4f97-94d4-4a490365e5fa\") " pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:17:00 crc kubenswrapper[4990]: I1003 11:17:00.110621 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:17:00 crc kubenswrapper[4990]: I1003 11:17:00.634410 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-844c8c7569-rln6q"] Oct 03 11:17:00 crc kubenswrapper[4990]: W1003 11:17:00.651097 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e437b5a_e091_4f97_94d4_4a490365e5fa.slice/crio-d946333cb450050cf3164e1d001db0f2a66c4d97056b52e718e8344fad1528a2 WatchSource:0}: Error finding container d946333cb450050cf3164e1d001db0f2a66c4d97056b52e718e8344fad1528a2: Status 404 returned error can't find the container with id d946333cb450050cf3164e1d001db0f2a66c4d97056b52e718e8344fad1528a2 Oct 03 11:17:01 crc kubenswrapper[4990]: I1003 11:17:01.457992 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844c8c7569-rln6q" 
event={"ID":"8e437b5a-e091-4f97-94d4-4a490365e5fa","Type":"ContainerStarted","Data":"265259430b9fc1b2bedbb95c2661a446e67a17808164430b3a2bbdfbe4a2c7c3"} Oct 03 11:17:01 crc kubenswrapper[4990]: I1003 11:17:01.458033 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844c8c7569-rln6q" event={"ID":"8e437b5a-e091-4f97-94d4-4a490365e5fa","Type":"ContainerStarted","Data":"1bb8b2d514cc78795d09bbeeaf2f473439f9d5822d5584253a89a66a53a63c8b"} Oct 03 11:17:01 crc kubenswrapper[4990]: I1003 11:17:01.458042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844c8c7569-rln6q" event={"ID":"8e437b5a-e091-4f97-94d4-4a490365e5fa","Type":"ContainerStarted","Data":"d946333cb450050cf3164e1d001db0f2a66c4d97056b52e718e8344fad1528a2"} Oct 03 11:17:01 crc kubenswrapper[4990]: I1003 11:17:01.458613 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:17:01 crc kubenswrapper[4990]: I1003 11:17:01.479876 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-844c8c7569-rln6q" podStartSLOduration=2.479858928 podStartE2EDuration="2.479858928s" podCreationTimestamp="2025-10-03 11:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:17:01.478308988 +0000 UTC m=+5603.274940845" watchObservedRunningTime="2025-10-03 11:17:01.479858928 +0000 UTC m=+5603.276490785" Oct 03 11:17:06 crc kubenswrapper[4990]: I1003 11:17:06.955899 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.018665 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.018895 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="dnsmasq-dns" containerID="cri-o://60925d16550e1f42fd4f9244844d930d27a5bf3a2967905822bb9a501f4b819a" gracePeriod=10 Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.509141 4990 generic.go:334] "Generic (PLEG): container finished" podID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerID="60925d16550e1f42fd4f9244844d930d27a5bf3a2967905822bb9a501f4b819a" exitCode=0 Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.509367 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" event={"ID":"bf17efcf-a86c-4335-bb56-3b52134a7ff6","Type":"ContainerDied","Data":"60925d16550e1f42fd4f9244844d930d27a5bf3a2967905822bb9a501f4b819a"} Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.509498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" event={"ID":"bf17efcf-a86c-4335-bb56-3b52134a7ff6","Type":"ContainerDied","Data":"59224563cd572557782a396412f3e526862cc84e7844b9b010a4e0c26f9b2d89"} Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.509532 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59224563cd572557782a396412f3e526862cc84e7844b9b010a4e0c26f9b2d89" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.526626 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.643796 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc\") pod \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.644192 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb\") pod \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.644244 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config\") pod \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.644280 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb\") pod \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.644417 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95hqk\" (UniqueName: \"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk\") pod \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\" (UID: \"bf17efcf-a86c-4335-bb56-3b52134a7ff6\") " Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.650345 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk" (OuterVolumeSpecName: "kube-api-access-95hqk") pod "bf17efcf-a86c-4335-bb56-3b52134a7ff6" (UID: "bf17efcf-a86c-4335-bb56-3b52134a7ff6"). InnerVolumeSpecName "kube-api-access-95hqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.698711 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf17efcf-a86c-4335-bb56-3b52134a7ff6" (UID: "bf17efcf-a86c-4335-bb56-3b52134a7ff6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.702045 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config" (OuterVolumeSpecName: "config") pod "bf17efcf-a86c-4335-bb56-3b52134a7ff6" (UID: "bf17efcf-a86c-4335-bb56-3b52134a7ff6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.702386 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf17efcf-a86c-4335-bb56-3b52134a7ff6" (UID: "bf17efcf-a86c-4335-bb56-3b52134a7ff6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.706814 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf17efcf-a86c-4335-bb56-3b52134a7ff6" (UID: "bf17efcf-a86c-4335-bb56-3b52134a7ff6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.747110 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.747146 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.747157 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.747168 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95hqk\" (UniqueName: \"kubernetes.io/projected/bf17efcf-a86c-4335-bb56-3b52134a7ff6-kube-api-access-95hqk\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:07 crc kubenswrapper[4990]: I1003 11:17:07.747178 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf17efcf-a86c-4335-bb56-3b52134a7ff6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:08 crc kubenswrapper[4990]: I1003 11:17:08.518601 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" Oct 03 11:17:08 crc kubenswrapper[4990]: I1003 11:17:08.553701 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:17:08 crc kubenswrapper[4990]: I1003 11:17:08.561055 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b8f884bf9-4dfpg"] Oct 03 11:17:08 crc kubenswrapper[4990]: I1003 11:17:08.887086 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" path="/var/lib/kubelet/pods/bf17efcf-a86c-4335-bb56-3b52134a7ff6/volumes" Oct 03 11:17:12 crc kubenswrapper[4990]: I1003 11:17:12.501997 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b8f884bf9-4dfpg" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.35:5353: i/o timeout" Oct 03 11:17:27 crc kubenswrapper[4990]: I1003 11:17:27.046661 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:17:30 crc kubenswrapper[4990]: I1003 11:17:30.133343 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-844c8c7569-rln6q" Oct 03 11:17:30 crc kubenswrapper[4990]: I1003 11:17:30.218206 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 03 11:17:30 crc kubenswrapper[4990]: I1003 11:17:30.218595 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7896899cb8-g24bj" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-api" containerID="cri-o://865e553cee902bd3fab3628a511fc97521c4b8047ac864fdfa8536d7d1df35f2" gracePeriod=30 Oct 03 11:17:30 crc kubenswrapper[4990]: I1003 11:17:30.218766 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-7896899cb8-g24bj" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-httpd" containerID="cri-o://c7c995dd130b6768d5d5cb4b64a814954547bcee9e894605854f70f2b3b519b6" gracePeriod=30 Oct 03 11:17:31 crc kubenswrapper[4990]: I1003 11:17:31.738676 4990 generic.go:334] "Generic (PLEG): container finished" podID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerID="c7c995dd130b6768d5d5cb4b64a814954547bcee9e894605854f70f2b3b519b6" exitCode=0 Oct 03 11:17:31 crc kubenswrapper[4990]: I1003 11:17:31.738712 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerDied","Data":"c7c995dd130b6768d5d5cb4b64a814954547bcee9e894605854f70f2b3b519b6"} Oct 03 11:17:34 crc kubenswrapper[4990]: I1003 11:17:34.769071 4990 generic.go:334] "Generic (PLEG): container finished" podID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerID="865e553cee902bd3fab3628a511fc97521c4b8047ac864fdfa8536d7d1df35f2" exitCode=0 Oct 03 11:17:34 crc kubenswrapper[4990]: I1003 11:17:34.769196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerDied","Data":"865e553cee902bd3fab3628a511fc97521c4b8047ac864fdfa8536d7d1df35f2"} Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.322195 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.470127 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config\") pod \"fcbedf11-27cd-4e84-997b-a1533f9425b6\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.470207 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs\") pod \"fcbedf11-27cd-4e84-997b-a1533f9425b6\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.470355 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whpqp\" (UniqueName: \"kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp\") pod \"fcbedf11-27cd-4e84-997b-a1533f9425b6\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.470482 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle\") pod \"fcbedf11-27cd-4e84-997b-a1533f9425b6\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.470571 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config\") pod \"fcbedf11-27cd-4e84-997b-a1533f9425b6\" (UID: \"fcbedf11-27cd-4e84-997b-a1533f9425b6\") " Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.476314 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fcbedf11-27cd-4e84-997b-a1533f9425b6" (UID: "fcbedf11-27cd-4e84-997b-a1533f9425b6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.490167 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp" (OuterVolumeSpecName: "kube-api-access-whpqp") pod "fcbedf11-27cd-4e84-997b-a1533f9425b6" (UID: "fcbedf11-27cd-4e84-997b-a1533f9425b6"). InnerVolumeSpecName "kube-api-access-whpqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.517044 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcbedf11-27cd-4e84-997b-a1533f9425b6" (UID: "fcbedf11-27cd-4e84-997b-a1533f9425b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.526576 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config" (OuterVolumeSpecName: "config") pod "fcbedf11-27cd-4e84-997b-a1533f9425b6" (UID: "fcbedf11-27cd-4e84-997b-a1533f9425b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.572590 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whpqp\" (UniqueName: \"kubernetes.io/projected/fcbedf11-27cd-4e84-997b-a1533f9425b6-kube-api-access-whpqp\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.572625 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.572638 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.572650 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.574960 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fcbedf11-27cd-4e84-997b-a1533f9425b6" (UID: "fcbedf11-27cd-4e84-997b-a1533f9425b6"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.674005 4990 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbedf11-27cd-4e84-997b-a1533f9425b6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.777787 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7896899cb8-g24bj" event={"ID":"fcbedf11-27cd-4e84-997b-a1533f9425b6","Type":"ContainerDied","Data":"f779cddef1a8f4032bec012e8fa67b7dbd9c4e80aefceeec9f1aff28c866e44d"} Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.777821 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7896899cb8-g24bj" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.777873 4990 scope.go:117] "RemoveContainer" containerID="c7c995dd130b6768d5d5cb4b64a814954547bcee9e894605854f70f2b3b519b6" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.810844 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.820391 4990 scope.go:117] "RemoveContainer" containerID="865e553cee902bd3fab3628a511fc97521c4b8047ac864fdfa8536d7d1df35f2" Oct 03 11:17:35 crc kubenswrapper[4990]: I1003 11:17:35.826192 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7896899cb8-g24bj"] Oct 03 11:17:36 crc kubenswrapper[4990]: I1003 11:17:36.900537 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" path="/var/lib/kubelet/pods/fcbedf11-27cd-4e84-997b-a1533f9425b6/volumes" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.624610 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2dmzm"] Oct 03 11:17:45 crc kubenswrapper[4990]: E1003 11:17:45.625292 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="init" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625304 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="init" Oct 03 11:17:45 crc kubenswrapper[4990]: E1003 11:17:45.625324 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-httpd" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625330 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-httpd" Oct 03 11:17:45 crc kubenswrapper[4990]: E1003 11:17:45.625350 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-api" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625357 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-api" Oct 03 11:17:45 crc kubenswrapper[4990]: E1003 11:17:45.625372 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="dnsmasq-dns" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625378 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="dnsmasq-dns" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625528 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-api" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625536 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf17efcf-a86c-4335-bb56-3b52134a7ff6" containerName="dnsmasq-dns" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.625558 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fcbedf11-27cd-4e84-997b-a1533f9425b6" containerName="neutron-httpd" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.626074 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.632522 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.632821 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.632954 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6w568" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.633100 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.633227 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.651959 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652001 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652025 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652210 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652283 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652683 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcg8\" (UniqueName: \"kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.652716 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 
11:17:45.668607 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2dmzm"] Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.732084 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.733388 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.748013 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756771 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756822 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756907 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcg8\" (UniqueName: 
\"kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756924 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756951 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.756995 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6lc\" (UniqueName: \"kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757037 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757066 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757084 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757122 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.757176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.758564 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts\") pod 
\"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.759119 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.762646 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.771324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.779379 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcg8\" (UniqueName: \"kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.790215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle\") pod \"swift-ring-rebalance-2dmzm\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " 
pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.858134 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6lc\" (UniqueName: \"kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.858204 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.858231 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.858296 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.858324 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc 
kubenswrapper[4990]: I1003 11:17:45.859071 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.859320 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.860041 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.860238 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.896766 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6lc\" (UniqueName: \"kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc\") pod \"dnsmasq-dns-f55767c85-mmg7w\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:45 crc kubenswrapper[4990]: I1003 11:17:45.945631 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.054116 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.443371 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2dmzm"] Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.605700 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.916365 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2dmzm" event={"ID":"9964f731-6bb0-46e1-879c-2201ad1c280c","Type":"ContainerStarted","Data":"493832d76850fc600c1fd51dab86155f92a10632b35f35e19f56d2b4e774a59c"} Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.916658 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2dmzm" event={"ID":"9964f731-6bb0-46e1-879c-2201ad1c280c","Type":"ContainerStarted","Data":"2359db0d1adf218f40d75a72a166214226452d140564d5c9a5884df67a18fd7b"} Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.918911 4990 generic.go:334] "Generic (PLEG): container finished" podID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerID="38cf246f6f532d86e2687493edd62607f2c2cebf46f90fd2071a02e221d03976" exitCode=0 Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.918949 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" event={"ID":"7072233f-3e4a-4dd0-b857-4a107eb3d1e3","Type":"ContainerDied","Data":"38cf246f6f532d86e2687493edd62607f2c2cebf46f90fd2071a02e221d03976"} Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.918965 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" 
event={"ID":"7072233f-3e4a-4dd0-b857-4a107eb3d1e3","Type":"ContainerStarted","Data":"3f4f58dbf4c3226816b84cd8a017b23cce8a21a3e414075881aa4ce4cf01e472"} Oct 03 11:17:46 crc kubenswrapper[4990]: I1003 11:17:46.934131 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2dmzm" podStartSLOduration=1.9341139649999999 podStartE2EDuration="1.934113965s" podCreationTimestamp="2025-10-03 11:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:17:46.929359783 +0000 UTC m=+5648.725991640" watchObservedRunningTime="2025-10-03 11:17:46.934113965 +0000 UTC m=+5648.730745822" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.813778 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.816129 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.818268 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.843668 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.896042 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.896157 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gk25\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.897353 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.897419 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " 
pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.897559 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.897887 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.931509 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" event={"ID":"7072233f-3e4a-4dd0-b857-4a107eb3d1e3","Type":"ContainerStarted","Data":"1867575ee49c1c0ef0338fa9444c73c2615236cdc96a2566c35141a06b7d555c"} Oct 03 11:17:47 crc kubenswrapper[4990]: I1003 11:17:47.931584 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000385 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000443 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data\") pod 
\"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000511 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000541 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.000672 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gk25\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.001911 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: 
\"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.002371 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.007325 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.015413 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.021113 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.027324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gk25\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25\") pod \"swift-proxy-769fc986df-85l96\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 
crc kubenswrapper[4990]: I1003 11:17:48.135252 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.790696 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" podStartSLOduration=3.790676577 podStartE2EDuration="3.790676577s" podCreationTimestamp="2025-10-03 11:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:17:47.960889181 +0000 UTC m=+5649.757521038" watchObservedRunningTime="2025-10-03 11:17:48.790676577 +0000 UTC m=+5650.587308434" Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.794351 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:17:48 crc kubenswrapper[4990]: I1003 11:17:48.938465 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerStarted","Data":"4f68aa82f7085b856784ac7063dec936393f72bd7044d4f70d08ac06ef089a55"} Oct 03 11:17:49 crc kubenswrapper[4990]: I1003 11:17:49.949216 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerStarted","Data":"f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a"} Oct 03 11:17:49 crc kubenswrapper[4990]: I1003 11:17:49.949643 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerStarted","Data":"5d3c01c3ba4ca3022baf02dfe36b6b6892ad427f93b6eb6ac8ae326fcb77f138"} Oct 03 11:17:49 crc kubenswrapper[4990]: I1003 11:17:49.949667 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:49 crc kubenswrapper[4990]: I1003 11:17:49.969545 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-769fc986df-85l96" podStartSLOduration=2.969510412 podStartE2EDuration="2.969510412s" podCreationTimestamp="2025-10-03 11:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:17:49.967686965 +0000 UTC m=+5651.764318832" watchObservedRunningTime="2025-10-03 11:17:49.969510412 +0000 UTC m=+5651.766142269" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.381412 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-686dbdfb85-nlj9c"] Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.382924 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.384799 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.386176 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.400633 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-686dbdfb85-nlj9c"] Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.449710 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-log-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.449824 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-internal-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.449860 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-combined-ca-bundle\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.449879 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-config-data\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.449947 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-etc-swift\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.450032 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27hk4\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-kube-api-access-27hk4\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.450062 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-public-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.450134 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-run-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.551807 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-run-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.551913 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-log-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.551974 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-internal-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552018 4990 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-combined-ca-bundle\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552045 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-config-data\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552075 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-etc-swift\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552118 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27hk4\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-kube-api-access-27hk4\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552156 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-public-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552376 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-run-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.552491 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9153262d-7256-4581-b8d7-bd3575ece8b1-log-httpd\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.560125 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-etc-swift\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.560552 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-config-data\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.563260 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-public-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.563331 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-combined-ca-bundle\") pod \"swift-proxy-686dbdfb85-nlj9c\" 
(UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.568212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9153262d-7256-4581-b8d7-bd3575ece8b1-internal-tls-certs\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.590466 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hk4\" (UniqueName: \"kubernetes.io/projected/9153262d-7256-4581-b8d7-bd3575ece8b1-kube-api-access-27hk4\") pod \"swift-proxy-686dbdfb85-nlj9c\" (UID: \"9153262d-7256-4581-b8d7-bd3575ece8b1\") " pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.700968 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:50 crc kubenswrapper[4990]: I1003 11:17:50.962770 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:51 crc kubenswrapper[4990]: I1003 11:17:51.357603 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-686dbdfb85-nlj9c"] Oct 03 11:17:51 crc kubenswrapper[4990]: I1003 11:17:51.971432 4990 generic.go:334] "Generic (PLEG): container finished" podID="9964f731-6bb0-46e1-879c-2201ad1c280c" containerID="493832d76850fc600c1fd51dab86155f92a10632b35f35e19f56d2b4e774a59c" exitCode=0 Oct 03 11:17:51 crc kubenswrapper[4990]: I1003 11:17:51.971575 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2dmzm" event={"ID":"9964f731-6bb0-46e1-879c-2201ad1c280c","Type":"ContainerDied","Data":"493832d76850fc600c1fd51dab86155f92a10632b35f35e19f56d2b4e774a59c"} Oct 03 11:17:51 crc 
kubenswrapper[4990]: I1003 11:17:51.973749 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-686dbdfb85-nlj9c" event={"ID":"9153262d-7256-4581-b8d7-bd3575ece8b1","Type":"ContainerStarted","Data":"4922bd2310aa21b5431b268e52a076b8223c04bc6b85a413b23edb453f14eebe"} Oct 03 11:17:51 crc kubenswrapper[4990]: I1003 11:17:51.973781 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-686dbdfb85-nlj9c" event={"ID":"9153262d-7256-4581-b8d7-bd3575ece8b1","Type":"ContainerStarted","Data":"790a422142284e698f3c5624fd057ca343330883abfdcdee331bfbcfa784b950"} Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.013645 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-686dbdfb85-nlj9c" event={"ID":"9153262d-7256-4581-b8d7-bd3575ece8b1","Type":"ContainerStarted","Data":"ade1a8a7790117607f38f6739e16104bc84337bc3403dfb85493aab8698df69d"} Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.015082 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.015176 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.141632 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.166179 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-686dbdfb85-nlj9c" podStartSLOduration=3.166160694 podStartE2EDuration="3.166160694s" podCreationTimestamp="2025-10-03 11:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:17:53.039963638 +0000 UTC m=+5654.836595495" watchObservedRunningTime="2025-10-03 
11:17:53.166160694 +0000 UTC m=+5654.962792551" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.474317 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669600 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669720 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669773 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669810 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669866 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 
11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.669970 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcg8\" (UniqueName: \"kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.670073 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf\") pod \"9964f731-6bb0-46e1-879c-2201ad1c280c\" (UID: \"9964f731-6bb0-46e1-879c-2201ad1c280c\") " Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.670348 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.670905 4990 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.671358 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.675276 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8" (OuterVolumeSpecName: "kube-api-access-xrcg8") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "kube-api-access-xrcg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.685762 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.697308 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts" (OuterVolumeSpecName: "scripts") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.720802 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.722725 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9964f731-6bb0-46e1-879c-2201ad1c280c" (UID: "9964f731-6bb0-46e1-879c-2201ad1c280c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773042 4990 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9964f731-6bb0-46e1-879c-2201ad1c280c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773094 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773117 4990 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773135 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcg8\" (UniqueName: \"kubernetes.io/projected/9964f731-6bb0-46e1-879c-2201ad1c280c-kube-api-access-xrcg8\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773154 4990 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9964f731-6bb0-46e1-879c-2201ad1c280c-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:53 crc kubenswrapper[4990]: I1003 11:17:53.773170 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9964f731-6bb0-46e1-879c-2201ad1c280c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:54 crc kubenswrapper[4990]: I1003 11:17:54.025677 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2dmzm" Oct 03 11:17:54 crc kubenswrapper[4990]: I1003 11:17:54.025971 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2dmzm" event={"ID":"9964f731-6bb0-46e1-879c-2201ad1c280c","Type":"ContainerDied","Data":"2359db0d1adf218f40d75a72a166214226452d140564d5c9a5884df67a18fd7b"} Oct 03 11:17:54 crc kubenswrapper[4990]: I1003 11:17:54.026003 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2359db0d1adf218f40d75a72a166214226452d140564d5c9a5884df67a18fd7b" Oct 03 11:17:55 crc kubenswrapper[4990]: I1003 11:17:55.304502 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:17:55 crc kubenswrapper[4990]: I1003 11:17:55.305052 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:17:56 crc kubenswrapper[4990]: I1003 11:17:56.055679 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:17:56 crc kubenswrapper[4990]: I1003 11:17:56.107935 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:17:56 crc kubenswrapper[4990]: I1003 11:17:56.108192 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="dnsmasq-dns" containerID="cri-o://5e97c2eab106ae6256d5c8598ed04045801bf712a1727dcb23afcd60619e343b" gracePeriod=10 Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.055165 4990 generic.go:334] "Generic (PLEG): container finished" podID="79caa14e-728c-4592-99f7-6779ece3603c" containerID="5e97c2eab106ae6256d5c8598ed04045801bf712a1727dcb23afcd60619e343b" exitCode=0 Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.055211 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" event={"ID":"79caa14e-728c-4592-99f7-6779ece3603c","Type":"ContainerDied","Data":"5e97c2eab106ae6256d5c8598ed04045801bf712a1727dcb23afcd60619e343b"} Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.153587 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.240613 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config\") pod \"79caa14e-728c-4592-99f7-6779ece3603c\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.240794 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb\") pod \"79caa14e-728c-4592-99f7-6779ece3603c\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.240849 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4rkg\" (UniqueName: 
\"kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg\") pod \"79caa14e-728c-4592-99f7-6779ece3603c\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.240973 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc\") pod \"79caa14e-728c-4592-99f7-6779ece3603c\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.241023 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb\") pod \"79caa14e-728c-4592-99f7-6779ece3603c\" (UID: \"79caa14e-728c-4592-99f7-6779ece3603c\") " Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.247756 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg" (OuterVolumeSpecName: "kube-api-access-b4rkg") pod "79caa14e-728c-4592-99f7-6779ece3603c" (UID: "79caa14e-728c-4592-99f7-6779ece3603c"). InnerVolumeSpecName "kube-api-access-b4rkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.283043 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79caa14e-728c-4592-99f7-6779ece3603c" (UID: "79caa14e-728c-4592-99f7-6779ece3603c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.287679 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config" (OuterVolumeSpecName: "config") pod "79caa14e-728c-4592-99f7-6779ece3603c" (UID: "79caa14e-728c-4592-99f7-6779ece3603c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.288702 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79caa14e-728c-4592-99f7-6779ece3603c" (UID: "79caa14e-728c-4592-99f7-6779ece3603c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.305676 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79caa14e-728c-4592-99f7-6779ece3603c" (UID: "79caa14e-728c-4592-99f7-6779ece3603c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.343156 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.343190 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4rkg\" (UniqueName: \"kubernetes.io/projected/79caa14e-728c-4592-99f7-6779ece3603c-kube-api-access-b4rkg\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.343203 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.343215 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:57 crc kubenswrapper[4990]: I1003 11:17:57.343227 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79caa14e-728c-4592-99f7-6779ece3603c-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.067385 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" event={"ID":"79caa14e-728c-4592-99f7-6779ece3603c","Type":"ContainerDied","Data":"6358c09d6f98bef7ca10b4c7e5541c7e0d03ce4a1385e75b9fef5c264c5a288d"} Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.067479 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.067757 4990 scope.go:117] "RemoveContainer" containerID="5e97c2eab106ae6256d5c8598ed04045801bf712a1727dcb23afcd60619e343b" Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.113625 4990 scope.go:117] "RemoveContainer" containerID="67423d40a7731a7603c1d3faec513b72cae043b6091331d72838320c19244dc2" Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.115643 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.123697 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d954d6867-dtm7x"] Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.139718 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:17:58 crc kubenswrapper[4990]: I1003 11:17:58.883919 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79caa14e-728c-4592-99f7-6779ece3603c" path="/var/lib/kubelet/pods/79caa14e-728c-4592-99f7-6779ece3603c/volumes" Oct 03 11:18:00 crc kubenswrapper[4990]: I1003 11:18:00.709992 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:18:00 crc kubenswrapper[4990]: I1003 11:18:00.710951 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-686dbdfb85-nlj9c" Oct 03 11:18:00 crc kubenswrapper[4990]: I1003 11:18:00.812603 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:18:00 crc kubenswrapper[4990]: I1003 11:18:00.812842 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-769fc986df-85l96" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-httpd" 
containerID="cri-o://5d3c01c3ba4ca3022baf02dfe36b6b6892ad427f93b6eb6ac8ae326fcb77f138" gracePeriod=30 Oct 03 11:18:00 crc kubenswrapper[4990]: I1003 11:18:00.813200 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-769fc986df-85l96" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-server" containerID="cri-o://f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a" gracePeriod=30 Oct 03 11:18:01 crc kubenswrapper[4990]: I1003 11:18:01.097330 4990 generic.go:334] "Generic (PLEG): container finished" podID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerID="5d3c01c3ba4ca3022baf02dfe36b6b6892ad427f93b6eb6ac8ae326fcb77f138" exitCode=0 Oct 03 11:18:01 crc kubenswrapper[4990]: I1003 11:18:01.097414 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerDied","Data":"5d3c01c3ba4ca3022baf02dfe36b6b6892ad427f93b6eb6ac8ae326fcb77f138"} Oct 03 11:18:01 crc kubenswrapper[4990]: E1003 11:18:01.376998 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8682c98_30ae_4582_93f3_7f5629760e0a.slice/crio-f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8682c98_30ae_4582_93f3_7f5629760e0a.slice/crio-conmon-f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:18:01 crc kubenswrapper[4990]: I1003 11:18:01.954239 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d954d6867-dtm7x" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.43:5353: i/o 
timeout" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.108901 4990 generic.go:334] "Generic (PLEG): container finished" podID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerID="f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a" exitCode=0 Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.108950 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerDied","Data":"f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a"} Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.363026 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453131 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gk25\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453178 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453211 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453235 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453292 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.453941 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.454350 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.454486 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle\") pod \"f8682c98-30ae-4582-93f3-7f5629760e0a\" (UID: \"f8682c98-30ae-4582-93f3-7f5629760e0a\") " Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.455391 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.455417 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8682c98-30ae-4582-93f3-7f5629760e0a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.459083 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25" (OuterVolumeSpecName: "kube-api-access-2gk25") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "kube-api-access-2gk25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.460373 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.532538 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.557191 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gk25\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-kube-api-access-2gk25\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.557227 4990 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f8682c98-30ae-4582-93f3-7f5629760e0a-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.557241 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.557323 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data" (OuterVolumeSpecName: "config-data") pod "f8682c98-30ae-4582-93f3-7f5629760e0a" (UID: "f8682c98-30ae-4582-93f3-7f5629760e0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:02 crc kubenswrapper[4990]: I1003 11:18:02.659492 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8682c98-30ae-4582-93f3-7f5629760e0a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.124143 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-769fc986df-85l96" event={"ID":"f8682c98-30ae-4582-93f3-7f5629760e0a","Type":"ContainerDied","Data":"4f68aa82f7085b856784ac7063dec936393f72bd7044d4f70d08ac06ef089a55"} Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.124654 4990 scope.go:117] "RemoveContainer" containerID="f4c49d631ad99ff2a16fb44ff277c1265bdbe18f99005914a387d10dcbeb9c4a" Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.124210 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-769fc986df-85l96" Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.164874 4990 scope.go:117] "RemoveContainer" containerID="5d3c01c3ba4ca3022baf02dfe36b6b6892ad427f93b6eb6ac8ae326fcb77f138" Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.167871 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:18:03 crc kubenswrapper[4990]: I1003 11:18:03.182862 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-769fc986df-85l96"] Oct 03 11:18:04 crc kubenswrapper[4990]: I1003 11:18:04.887491 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" path="/var/lib/kubelet/pods/f8682c98-30ae-4582-93f3-7f5629760e0a/volumes" Oct 03 11:18:25 crc kubenswrapper[4990]: I1003 11:18:25.304418 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:18:25 crc kubenswrapper[4990]: I1003 11:18:25.305040 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.509423 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wxppx"] Oct 03 11:18:33 crc kubenswrapper[4990]: E1003 11:18:33.510582 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="init" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.510603 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="init" Oct 03 11:18:33 crc kubenswrapper[4990]: E1003 11:18:33.510633 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9964f731-6bb0-46e1-879c-2201ad1c280c" containerName="swift-ring-rebalance" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.510645 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9964f731-6bb0-46e1-879c-2201ad1c280c" containerName="swift-ring-rebalance" Oct 03 11:18:33 crc kubenswrapper[4990]: E1003 11:18:33.510670 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="dnsmasq-dns" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.510681 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="dnsmasq-dns" Oct 03 11:18:33 crc kubenswrapper[4990]: E1003 11:18:33.510706 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-server" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.510717 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-server" Oct 03 11:18:33 crc kubenswrapper[4990]: E1003 11:18:33.510735 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-httpd" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.510745 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-httpd" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.511031 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="79caa14e-728c-4592-99f7-6779ece3603c" containerName="dnsmasq-dns" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.511052 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-httpd" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.511066 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9964f731-6bb0-46e1-879c-2201ad1c280c" containerName="swift-ring-rebalance" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.511096 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8682c98-30ae-4582-93f3-7f5629760e0a" containerName="proxy-server" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.512023 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.519859 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wxppx"] Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.685470 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4msk\" (UniqueName: \"kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk\") pod \"cinder-db-create-wxppx\" (UID: \"7dd54627-c7cf-4726-b0a4-fcca4f641a25\") " pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.787367 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4msk\" (UniqueName: \"kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk\") pod \"cinder-db-create-wxppx\" (UID: \"7dd54627-c7cf-4726-b0a4-fcca4f641a25\") " pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.805647 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4msk\" (UniqueName: \"kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk\") pod \"cinder-db-create-wxppx\" (UID: \"7dd54627-c7cf-4726-b0a4-fcca4f641a25\") " pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:33 crc kubenswrapper[4990]: I1003 11:18:33.831829 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:34 crc kubenswrapper[4990]: I1003 11:18:34.331669 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wxppx"] Oct 03 11:18:34 crc kubenswrapper[4990]: I1003 11:18:34.443335 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wxppx" event={"ID":"7dd54627-c7cf-4726-b0a4-fcca4f641a25","Type":"ContainerStarted","Data":"85623613b6d5f5237ca39e966f02791e3255ce98ab7d480fee66065aacc7812c"} Oct 03 11:18:35 crc kubenswrapper[4990]: I1003 11:18:35.454489 4990 generic.go:334] "Generic (PLEG): container finished" podID="7dd54627-c7cf-4726-b0a4-fcca4f641a25" containerID="11e77314503db0ca56eb0c2a64e433c770a9a2765dad2e037e3bcba34985d5c9" exitCode=0 Oct 03 11:18:35 crc kubenswrapper[4990]: I1003 11:18:35.454668 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wxppx" event={"ID":"7dd54627-c7cf-4726-b0a4-fcca4f641a25","Type":"ContainerDied","Data":"11e77314503db0ca56eb0c2a64e433c770a9a2765dad2e037e3bcba34985d5c9"} Oct 03 11:18:36 crc kubenswrapper[4990]: I1003 11:18:36.781488 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:36 crc kubenswrapper[4990]: I1003 11:18:36.951531 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4msk\" (UniqueName: \"kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk\") pod \"7dd54627-c7cf-4726-b0a4-fcca4f641a25\" (UID: \"7dd54627-c7cf-4726-b0a4-fcca4f641a25\") " Oct 03 11:18:36 crc kubenswrapper[4990]: I1003 11:18:36.956045 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk" (OuterVolumeSpecName: "kube-api-access-z4msk") pod "7dd54627-c7cf-4726-b0a4-fcca4f641a25" (UID: "7dd54627-c7cf-4726-b0a4-fcca4f641a25"). InnerVolumeSpecName "kube-api-access-z4msk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:37 crc kubenswrapper[4990]: I1003 11:18:37.053311 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4msk\" (UniqueName: \"kubernetes.io/projected/7dd54627-c7cf-4726-b0a4-fcca4f641a25-kube-api-access-z4msk\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:37 crc kubenswrapper[4990]: I1003 11:18:37.473528 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wxppx" event={"ID":"7dd54627-c7cf-4726-b0a4-fcca4f641a25","Type":"ContainerDied","Data":"85623613b6d5f5237ca39e966f02791e3255ce98ab7d480fee66065aacc7812c"} Oct 03 11:18:37 crc kubenswrapper[4990]: I1003 11:18:37.473580 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85623613b6d5f5237ca39e966f02791e3255ce98ab7d480fee66065aacc7812c" Oct 03 11:18:37 crc kubenswrapper[4990]: I1003 11:18:37.473652 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wxppx" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.562668 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f979-account-create-zgdsc"] Oct 03 11:18:43 crc kubenswrapper[4990]: E1003 11:18:43.563655 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd54627-c7cf-4726-b0a4-fcca4f641a25" containerName="mariadb-database-create" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.563672 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd54627-c7cf-4726-b0a4-fcca4f641a25" containerName="mariadb-database-create" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.563848 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd54627-c7cf-4726-b0a4-fcca4f641a25" containerName="mariadb-database-create" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.564704 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.568867 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.583817 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f979-account-create-zgdsc"] Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.683405 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrc8t\" (UniqueName: \"kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t\") pod \"cinder-f979-account-create-zgdsc\" (UID: \"69a6af39-f0fb-4c78-9119-8a0576429a95\") " pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.785186 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrc8t\" (UniqueName: 
\"kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t\") pod \"cinder-f979-account-create-zgdsc\" (UID: \"69a6af39-f0fb-4c78-9119-8a0576429a95\") " pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.806670 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrc8t\" (UniqueName: \"kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t\") pod \"cinder-f979-account-create-zgdsc\" (UID: \"69a6af39-f0fb-4c78-9119-8a0576429a95\") " pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:43 crc kubenswrapper[4990]: I1003 11:18:43.890000 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:44 crc kubenswrapper[4990]: I1003 11:18:44.350880 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f979-account-create-zgdsc"] Oct 03 11:18:44 crc kubenswrapper[4990]: I1003 11:18:44.538933 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f979-account-create-zgdsc" event={"ID":"69a6af39-f0fb-4c78-9119-8a0576429a95","Type":"ContainerStarted","Data":"acac4805b355bbecc9c23cf008908d297bd182160c9e14673c2615734d5019a9"} Oct 03 11:18:45 crc kubenswrapper[4990]: I1003 11:18:45.549831 4990 generic.go:334] "Generic (PLEG): container finished" podID="69a6af39-f0fb-4c78-9119-8a0576429a95" containerID="8081059dffbadd1ac4461b04d6fa5f91d310b566dd296a8ae9d0771885fd62e8" exitCode=0 Oct 03 11:18:45 crc kubenswrapper[4990]: I1003 11:18:45.549901 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f979-account-create-zgdsc" event={"ID":"69a6af39-f0fb-4c78-9119-8a0576429a95","Type":"ContainerDied","Data":"8081059dffbadd1ac4461b04d6fa5f91d310b566dd296a8ae9d0771885fd62e8"} Oct 03 11:18:46 crc kubenswrapper[4990]: I1003 11:18:46.949080 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:46 crc kubenswrapper[4990]: I1003 11:18:46.971539 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrc8t\" (UniqueName: \"kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t\") pod \"69a6af39-f0fb-4c78-9119-8a0576429a95\" (UID: \"69a6af39-f0fb-4c78-9119-8a0576429a95\") " Oct 03 11:18:46 crc kubenswrapper[4990]: I1003 11:18:46.993134 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t" (OuterVolumeSpecName: "kube-api-access-wrc8t") pod "69a6af39-f0fb-4c78-9119-8a0576429a95" (UID: "69a6af39-f0fb-4c78-9119-8a0576429a95"). InnerVolumeSpecName "kube-api-access-wrc8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:47 crc kubenswrapper[4990]: I1003 11:18:47.074400 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrc8t\" (UniqueName: \"kubernetes.io/projected/69a6af39-f0fb-4c78-9119-8a0576429a95-kube-api-access-wrc8t\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:47 crc kubenswrapper[4990]: I1003 11:18:47.571979 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f979-account-create-zgdsc" event={"ID":"69a6af39-f0fb-4c78-9119-8a0576429a95","Type":"ContainerDied","Data":"acac4805b355bbecc9c23cf008908d297bd182160c9e14673c2615734d5019a9"} Oct 03 11:18:47 crc kubenswrapper[4990]: I1003 11:18:47.572033 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acac4805b355bbecc9c23cf008908d297bd182160c9e14673c2615734d5019a9" Oct 03 11:18:47 crc kubenswrapper[4990]: I1003 11:18:47.572054 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f979-account-create-zgdsc" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.521870 4990 scope.go:117] "RemoveContainer" containerID="c4e2610ba2eb582e91d1cbad6ef217d3f68c47d112529a1e2c78a8e225042006" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.710507 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jzsvg"] Oct 03 11:18:48 crc kubenswrapper[4990]: E1003 11:18:48.711324 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a6af39-f0fb-4c78-9119-8a0576429a95" containerName="mariadb-account-create" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.711367 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a6af39-f0fb-4c78-9119-8a0576429a95" containerName="mariadb-account-create" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.711742 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a6af39-f0fb-4c78-9119-8a0576429a95" containerName="mariadb-account-create" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.712813 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.716095 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.716301 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.716646 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n8xr5" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.718973 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jzsvg"] Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909416 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909489 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909578 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909697 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909842 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:48 crc kubenswrapper[4990]: I1003 11:18:48.909954 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6sc\" (UniqueName: \"kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011316 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011419 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011481 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011624 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6sc\" (UniqueName: \"kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011709 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.011809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.012020 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.018015 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: 
\"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.018420 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.019039 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.022895 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.043465 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6sc\" (UniqueName: \"kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc\") pod \"cinder-db-sync-jzsvg\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.333369 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:49 crc kubenswrapper[4990]: I1003 11:18:49.766278 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jzsvg"] Oct 03 11:18:50 crc kubenswrapper[4990]: I1003 11:18:50.605196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jzsvg" event={"ID":"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520","Type":"ContainerStarted","Data":"eac84ab12b81e3fe136aac344cec54856586efb28d93c8e40eb507533c325e79"} Oct 03 11:18:50 crc kubenswrapper[4990]: I1003 11:18:50.605558 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jzsvg" event={"ID":"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520","Type":"ContainerStarted","Data":"bc7e030aa0694a85a90912b3187760073e51483ba9d6021e44962d508dfaf8ca"} Oct 03 11:18:50 crc kubenswrapper[4990]: I1003 11:18:50.621996 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jzsvg" podStartSLOduration=2.621977052 podStartE2EDuration="2.621977052s" podCreationTimestamp="2025-10-03 11:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:18:50.619883168 +0000 UTC m=+5712.416515045" watchObservedRunningTime="2025-10-03 11:18:50.621977052 +0000 UTC m=+5712.418608899" Oct 03 11:18:53 crc kubenswrapper[4990]: I1003 11:18:53.632961 4990 generic.go:334] "Generic (PLEG): container finished" podID="6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" containerID="eac84ab12b81e3fe136aac344cec54856586efb28d93c8e40eb507533c325e79" exitCode=0 Oct 03 11:18:53 crc kubenswrapper[4990]: I1003 11:18:53.633042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jzsvg" event={"ID":"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520","Type":"ContainerDied","Data":"eac84ab12b81e3fe136aac344cec54856586efb28d93c8e40eb507533c325e79"} Oct 03 11:18:55 crc kubenswrapper[4990]: 
I1003 11:18:55.035602 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167099 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167489 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167572 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167589 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167604 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167662 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6sc\" (UniqueName: \"kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc\") pod \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\" (UID: \"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520\") " Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.167811 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.168252 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.172834 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.173187 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts" (OuterVolumeSpecName: "scripts") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.180617 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc" (OuterVolumeSpecName: "kube-api-access-hn6sc") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "kube-api-access-hn6sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.203026 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.214631 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data" (OuterVolumeSpecName: "config-data") pod "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" (UID: "6893b31d-52e8-4ca8-b8eb-9dd0cff0b520"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.269786 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.269819 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.269835 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.269847 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6sc\" (UniqueName: \"kubernetes.io/projected/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-kube-api-access-hn6sc\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.269859 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.304311 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.304383 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.304436 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.305145 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.305198 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" gracePeriod=600 Oct 03 11:18:55 crc kubenswrapper[4990]: E1003 11:18:55.495110 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.656286 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" exitCode=0 Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 
11:18:55.656331 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16"} Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.656786 4990 scope.go:117] "RemoveContainer" containerID="45f44d1104d8a5f414e2f55b32a55a6324ac5627c3c486e3590389dd674e36eb" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.657444 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:18:55 crc kubenswrapper[4990]: E1003 11:18:55.658721 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.659590 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jzsvg" event={"ID":"6893b31d-52e8-4ca8-b8eb-9dd0cff0b520","Type":"ContainerDied","Data":"bc7e030aa0694a85a90912b3187760073e51483ba9d6021e44962d508dfaf8ca"} Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.659619 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7e030aa0694a85a90912b3187760073e51483ba9d6021e44962d508dfaf8ca" Oct 03 11:18:55 crc kubenswrapper[4990]: I1003 11:18:55.659905 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jzsvg" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.007747 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:18:56 crc kubenswrapper[4990]: E1003 11:18:56.009889 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" containerName="cinder-db-sync" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.010779 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" containerName="cinder-db-sync" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.011208 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" containerName="cinder-db-sync" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.012832 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.028097 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.160775 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.162036 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.164942 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.165178 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n8xr5" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.165337 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.165477 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.174879 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.185190 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pphql\" (UniqueName: \"kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.185285 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.185358 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: 
\"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.185432 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.185460 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287274 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287581 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287683 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2vw\" (UniqueName: \"kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw\") pod \"cinder-api-0\" (UID: 
\"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287792 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287878 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.287962 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288085 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288193 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 
11:18:56.288266 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288357 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pphql\" (UniqueName: \"kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288434 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288549 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288748 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.288899 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.289446 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.289569 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.308313 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pphql\" (UniqueName: \"kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql\") pod \"dnsmasq-dns-7ddb76bc4c-gczgf\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.331639 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.390777 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.390842 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.390908 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.390948 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.390987 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.391076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.391109 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2vw\" (UniqueName: \"kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.392995 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.393132 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.396390 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.396601 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.397683 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.398959 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.409672 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2vw\" (UniqueName: \"kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw\") pod \"cinder-api-0\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.475873 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.915348 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:18:56 crc kubenswrapper[4990]: W1003 11:18:56.922381 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod466430e5_4fa3_406b_8d17_bb196e3a46cc.slice/crio-328ace8aa1b65761ce017dda8bd4e24f2e58d738a5f992c73f8b8b2a3888febe WatchSource:0}: Error finding container 328ace8aa1b65761ce017dda8bd4e24f2e58d738a5f992c73f8b8b2a3888febe: Status 404 returned error can't find the container with id 328ace8aa1b65761ce017dda8bd4e24f2e58d738a5f992c73f8b8b2a3888febe Oct 03 11:18:56 crc kubenswrapper[4990]: I1003 11:18:56.928357 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.697361 4990 generic.go:334] "Generic (PLEG): container finished" podID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerID="c230189993db54c1a98be05ee508a401ad3959b63b83568d61ba01e914c2d667" exitCode=0 Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.697404 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" event={"ID":"670a7ccb-4dec-44e3-aaea-53cb7b576bdb","Type":"ContainerDied","Data":"c230189993db54c1a98be05ee508a401ad3959b63b83568d61ba01e914c2d667"} Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.697864 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" event={"ID":"670a7ccb-4dec-44e3-aaea-53cb7b576bdb","Type":"ContainerStarted","Data":"a58c822c4802e6ef1d0aba2ffed7b9fe22450d445e0a7ce628a99a636063af02"} Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.707335 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerStarted","Data":"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8"} Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.707402 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerStarted","Data":"328ace8aa1b65761ce017dda8bd4e24f2e58d738a5f992c73f8b8b2a3888febe"} Oct 03 11:18:57 crc kubenswrapper[4990]: I1003 11:18:57.836553 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.717134 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" event={"ID":"670a7ccb-4dec-44e3-aaea-53cb7b576bdb","Type":"ContainerStarted","Data":"f7445bc5a4ec5736c58bc494822424840d2afc5903fb6db097b2521ff9b8224c"} Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.717566 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.719511 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerStarted","Data":"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654"} Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.719713 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api" containerID="cri-o://4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" gracePeriod=30 Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.719735 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.719688 4990 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api-log" containerID="cri-o://1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" gracePeriod=30 Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.742920 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" podStartSLOduration=3.742905722 podStartE2EDuration="3.742905722s" podCreationTimestamp="2025-10-03 11:18:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:18:58.73774749 +0000 UTC m=+5720.534379347" watchObservedRunningTime="2025-10-03 11:18:58.742905722 +0000 UTC m=+5720.539537569" Oct 03 11:18:58 crc kubenswrapper[4990]: I1003 11:18:58.762931 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.7629045249999997 podStartE2EDuration="2.762904525s" podCreationTimestamp="2025-10-03 11:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:18:58.758503712 +0000 UTC m=+5720.555135579" watchObservedRunningTime="2025-10-03 11:18:58.762904525 +0000 UTC m=+5720.559536402" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.346530 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477244 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2vw\" (UniqueName: \"kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477361 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477380 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477456 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477572 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477605 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.477808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.478084 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs\") pod \"466430e5-4fa3-406b-8d17-bb196e3a46cc\" (UID: \"466430e5-4fa3-406b-8d17-bb196e3a46cc\") " Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.478371 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs" (OuterVolumeSpecName: "logs") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.478765 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466430e5-4fa3-406b-8d17-bb196e3a46cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.478783 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466430e5-4fa3-406b-8d17-bb196e3a46cc-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.482467 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw" (OuterVolumeSpecName: "kube-api-access-zc2vw") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "kube-api-access-zc2vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.482765 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.486483 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts" (OuterVolumeSpecName: "scripts") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.505676 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.540182 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data" (OuterVolumeSpecName: "config-data") pod "466430e5-4fa3-406b-8d17-bb196e3a46cc" (UID: "466430e5-4fa3-406b-8d17-bb196e3a46cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.581013 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.581056 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.581072 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2vw\" (UniqueName: \"kubernetes.io/projected/466430e5-4fa3-406b-8d17-bb196e3a46cc-kube-api-access-zc2vw\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.581086 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.581099 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466430e5-4fa3-406b-8d17-bb196e3a46cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.730832 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.730886 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerDied","Data":"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654"} Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.730950 4990 scope.go:117] "RemoveContainer" containerID="4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.730823 4990 generic.go:334] "Generic (PLEG): container finished" podID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerID="4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" exitCode=0 Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.731145 4990 generic.go:334] "Generic (PLEG): container finished" podID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerID="1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" exitCode=143 Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.731295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerDied","Data":"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8"} Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.731364 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"466430e5-4fa3-406b-8d17-bb196e3a46cc","Type":"ContainerDied","Data":"328ace8aa1b65761ce017dda8bd4e24f2e58d738a5f992c73f8b8b2a3888febe"} Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.754580 4990 scope.go:117] "RemoveContainer" containerID="1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.769439 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.778536 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.796440 4990 scope.go:117] "RemoveContainer" containerID="4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" Oct 03 11:18:59 crc kubenswrapper[4990]: E1003 11:18:59.799780 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654\": container with ID starting with 4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654 not found: ID does not exist" containerID="4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.799838 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654"} err="failed to get container status \"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654\": rpc error: code = NotFound desc = could not find container \"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654\": container with ID starting with 4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654 not found: ID does not exist" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.799874 4990 scope.go:117] "RemoveContainer" 
containerID="1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" Oct 03 11:18:59 crc kubenswrapper[4990]: E1003 11:18:59.800264 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8\": container with ID starting with 1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8 not found: ID does not exist" containerID="1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.800297 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8"} err="failed to get container status \"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8\": rpc error: code = NotFound desc = could not find container \"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8\": container with ID starting with 1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8 not found: ID does not exist" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.800319 4990 scope.go:117] "RemoveContainer" containerID="4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.800765 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654"} err="failed to get container status \"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654\": rpc error: code = NotFound desc = could not find container \"4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654\": container with ID starting with 4d11336b32aa3143c6231b8f334f1356a2e577da5203a9bf5c6a9c481d46c654 not found: ID does not exist" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.800803 4990 scope.go:117] 
"RemoveContainer" containerID="1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.802147 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8"} err="failed to get container status \"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8\": rpc error: code = NotFound desc = could not find container \"1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8\": container with ID starting with 1196b9d3b7f9335576e6352593f77dc22e392f5d8a3b1da5fe22671f643bd8a8 not found: ID does not exist" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.810551 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:59 crc kubenswrapper[4990]: E1003 11:18:59.811132 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api-log" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.811164 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api-log" Oct 03 11:18:59 crc kubenswrapper[4990]: E1003 11:18:59.811194 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.811208 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.811605 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" containerName="cinder-api" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.811636 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" 
containerName="cinder-api-log" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.813211 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.821848 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822025 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822117 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822253 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822391 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822489 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n8xr5" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.822689 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.886797 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.886864 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887044 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887099 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887214 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cbr8\" (UniqueName: \"kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887300 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887408 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 
crc kubenswrapper[4990]: I1003 11:18:59.887458 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.887581 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.990012 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.990183 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991030 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991434 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991689 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991742 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991886 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991928 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.991978 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.992130 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6cbr8\" (UniqueName: \"kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.992263 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.996137 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.997130 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:18:59 crc kubenswrapper[4990]: I1003 11:18:59.998236 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:18:59.999977 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 
03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.000601 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.007886 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.020402 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cbr8\" (UniqueName: \"kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8\") pod \"cinder-api-0\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " pod="openstack/cinder-api-0" Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.145248 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.656636 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:00 crc kubenswrapper[4990]: W1003 11:19:00.665525 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a0027a1_059e_49c9_91f3_0781aafd3c1e.slice/crio-fc121a594d089688d76156340c72a32c60cee8338bba4bc8e245cbd0b24c28e6 WatchSource:0}: Error finding container fc121a594d089688d76156340c72a32c60cee8338bba4bc8e245cbd0b24c28e6: Status 404 returned error can't find the container with id fc121a594d089688d76156340c72a32c60cee8338bba4bc8e245cbd0b24c28e6 Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.743023 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerStarted","Data":"fc121a594d089688d76156340c72a32c60cee8338bba4bc8e245cbd0b24c28e6"} Oct 03 11:19:00 crc kubenswrapper[4990]: I1003 11:19:00.881741 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466430e5-4fa3-406b-8d17-bb196e3a46cc" path="/var/lib/kubelet/pods/466430e5-4fa3-406b-8d17-bb196e3a46cc/volumes" Oct 03 11:19:01 crc kubenswrapper[4990]: I1003 11:19:01.760879 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerStarted","Data":"9f0147ca714db6e66ae815ca7586179b3b160dcf0f90124e56efc34c7d17c6e0"} Oct 03 11:19:01 crc kubenswrapper[4990]: I1003 11:19:01.762181 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 11:19:01 crc kubenswrapper[4990]: I1003 11:19:01.762277 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerStarted","Data":"2a436b4d5dc4befa1e1a6be2d67d03050da869ade61d13258aded031788e2b3e"} Oct 03 11:19:01 crc kubenswrapper[4990]: I1003 11:19:01.793427 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.793412337 podStartE2EDuration="2.793412337s" podCreationTimestamp="2025-10-03 11:18:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:19:01.786308245 +0000 UTC m=+5723.582940112" watchObservedRunningTime="2025-10-03 11:19:01.793412337 +0000 UTC m=+5723.590044194" Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.333761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.415186 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.415929 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="dnsmasq-dns" containerID="cri-o://1867575ee49c1c0ef0338fa9444c73c2615236cdc96a2566c35141a06b7d555c" gracePeriod=10 Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.812897 4990 generic.go:334] "Generic (PLEG): container finished" podID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerID="1867575ee49c1c0ef0338fa9444c73c2615236cdc96a2566c35141a06b7d555c" exitCode=0 Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.812958 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" event={"ID":"7072233f-3e4a-4dd0-b857-4a107eb3d1e3","Type":"ContainerDied","Data":"1867575ee49c1c0ef0338fa9444c73c2615236cdc96a2566c35141a06b7d555c"} Oct 03 11:19:06 crc 
kubenswrapper[4990]: I1003 11:19:06.812996 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" event={"ID":"7072233f-3e4a-4dd0-b857-4a107eb3d1e3","Type":"ContainerDied","Data":"3f4f58dbf4c3226816b84cd8a017b23cce8a21a3e414075881aa4ce4cf01e472"} Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.813016 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4f58dbf4c3226816b84cd8a017b23cce8a21a3e414075881aa4ce4cf01e472" Oct 03 11:19:06 crc kubenswrapper[4990]: I1003 11:19:06.874808 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.016286 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb\") pod \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.016343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config\") pod \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.016429 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb\") pod \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.016476 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c6lc\" (UniqueName: 
\"kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc\") pod \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.016534 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc\") pod \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\" (UID: \"7072233f-3e4a-4dd0-b857-4a107eb3d1e3\") " Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.022301 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc" (OuterVolumeSpecName: "kube-api-access-5c6lc") pod "7072233f-3e4a-4dd0-b857-4a107eb3d1e3" (UID: "7072233f-3e4a-4dd0-b857-4a107eb3d1e3"). InnerVolumeSpecName "kube-api-access-5c6lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.062378 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7072233f-3e4a-4dd0-b857-4a107eb3d1e3" (UID: "7072233f-3e4a-4dd0-b857-4a107eb3d1e3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.077588 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config" (OuterVolumeSpecName: "config") pod "7072233f-3e4a-4dd0-b857-4a107eb3d1e3" (UID: "7072233f-3e4a-4dd0-b857-4a107eb3d1e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.079360 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7072233f-3e4a-4dd0-b857-4a107eb3d1e3" (UID: "7072233f-3e4a-4dd0-b857-4a107eb3d1e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.080986 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7072233f-3e4a-4dd0-b857-4a107eb3d1e3" (UID: "7072233f-3e4a-4dd0-b857-4a107eb3d1e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.119403 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.119659 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.119777 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.119852 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c6lc\" (UniqueName: \"kubernetes.io/projected/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-kube-api-access-5c6lc\") on node \"crc\" DevicePath \"\"" Oct 03 
11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.119911 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7072233f-3e4a-4dd0-b857-4a107eb3d1e3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.826469 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f55767c85-mmg7w" Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.881815 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:19:07 crc kubenswrapper[4990]: I1003 11:19:07.889790 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f55767c85-mmg7w"] Oct 03 11:19:08 crc kubenswrapper[4990]: I1003 11:19:08.894413 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" path="/var/lib/kubelet/pods/7072233f-3e4a-4dd0-b857-4a107eb3d1e3/volumes" Oct 03 11:19:09 crc kubenswrapper[4990]: I1003 11:19:09.872104 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:19:09 crc kubenswrapper[4990]: E1003 11:19:09.872494 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:19:11 crc kubenswrapper[4990]: I1003 11:19:11.845265 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 11:19:20 crc kubenswrapper[4990]: I1003 11:19:20.871797 4990 scope.go:117] "RemoveContainer" 
containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:19:20 crc kubenswrapper[4990]: E1003 11:19:20.873903 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.176023 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:29 crc kubenswrapper[4990]: E1003 11:19:29.176831 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="init" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.176844 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="init" Oct 03 11:19:29 crc kubenswrapper[4990]: E1003 11:19:29.176878 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="dnsmasq-dns" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.176884 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="dnsmasq-dns" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.177036 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7072233f-3e4a-4dd0-b857-4a107eb3d1e3" containerName="dnsmasq-dns" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.177935 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.179744 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.183126 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310033 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310392 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310465 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310504 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310548 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clfqq\" (UniqueName: \"kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.310629 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413044 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413150 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413319 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413457 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413612 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413668 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clfqq\" (UniqueName: \"kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.413798 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.421160 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.422305 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " 
pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.423081 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.423467 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.432855 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clfqq\" (UniqueName: \"kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq\") pod \"cinder-scheduler-0\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:29 crc kubenswrapper[4990]: I1003 11:19:29.510703 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:30 crc kubenswrapper[4990]: I1003 11:19:30.008291 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:30 crc kubenswrapper[4990]: I1003 11:19:30.073335 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerStarted","Data":"a04f38a295653c9da46720d876d4d7eaddb4a2efce14f6022da157a6bfd922e0"} Oct 03 11:19:30 crc kubenswrapper[4990]: I1003 11:19:30.635446 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:30 crc kubenswrapper[4990]: I1003 11:19:30.636034 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api-log" containerID="cri-o://2a436b4d5dc4befa1e1a6be2d67d03050da869ade61d13258aded031788e2b3e" gracePeriod=30 Oct 03 11:19:30 crc kubenswrapper[4990]: I1003 11:19:30.636146 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api" containerID="cri-o://9f0147ca714db6e66ae815ca7586179b3b160dcf0f90124e56efc34c7d17c6e0" gracePeriod=30 Oct 03 11:19:31 crc kubenswrapper[4990]: I1003 11:19:31.084214 4990 generic.go:334] "Generic (PLEG): container finished" podID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerID="2a436b4d5dc4befa1e1a6be2d67d03050da869ade61d13258aded031788e2b3e" exitCode=143 Oct 03 11:19:31 crc kubenswrapper[4990]: I1003 11:19:31.084314 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerDied","Data":"2a436b4d5dc4befa1e1a6be2d67d03050da869ade61d13258aded031788e2b3e"} Oct 03 11:19:31 crc kubenswrapper[4990]: I1003 11:19:31.086050 4990 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerStarted","Data":"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887"} Oct 03 11:19:32 crc kubenswrapper[4990]: I1003 11:19:32.095824 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerStarted","Data":"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f"} Oct 03 11:19:32 crc kubenswrapper[4990]: I1003 11:19:32.116488 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.1164705 podStartE2EDuration="3.1164705s" podCreationTimestamp="2025-10-03 11:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:19:32.113447262 +0000 UTC m=+5753.910079119" watchObservedRunningTime="2025-10-03 11:19:32.1164705 +0000 UTC m=+5753.913102357" Oct 03 11:19:32 crc kubenswrapper[4990]: I1003 11:19:32.872681 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:19:32 crc kubenswrapper[4990]: E1003 11:19:32.873276 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.127846 4990 generic.go:334] "Generic (PLEG): container finished" podID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerID="9f0147ca714db6e66ae815ca7586179b3b160dcf0f90124e56efc34c7d17c6e0" 
exitCode=0 Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.127921 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerDied","Data":"9f0147ca714db6e66ae815ca7586179b3b160dcf0f90124e56efc34c7d17c6e0"} Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.272769 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408153 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408195 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408210 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408270 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408291 4990 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408317 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408435 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408456 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cbr8\" (UniqueName: \"kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408475 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id\") pod \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\" (UID: \"6a0027a1-059e-49c9-91f3-0781aafd3c1e\") " Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408611 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: 
"6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.408918 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0027a1-059e-49c9-91f3-0781aafd3c1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.409972 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs" (OuterVolumeSpecName: "logs") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.414230 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts" (OuterVolumeSpecName: "scripts") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.414664 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8" (OuterVolumeSpecName: "kube-api-access-6cbr8") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "kube-api-access-6cbr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.418987 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.439769 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.462294 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data" (OuterVolumeSpecName: "config-data") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.464333 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.480178 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a0027a1-059e-49c9-91f3-0781aafd3c1e" (UID: "6a0027a1-059e-49c9-91f3-0781aafd3c1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511329 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511372 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511384 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511395 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511407 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511418 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6a0027a1-059e-49c9-91f3-0781aafd3c1e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511427 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0027a1-059e-49c9-91f3-0781aafd3c1e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511438 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cbr8\" (UniqueName: \"kubernetes.io/projected/6a0027a1-059e-49c9-91f3-0781aafd3c1e-kube-api-access-6cbr8\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:34 crc kubenswrapper[4990]: I1003 11:19:34.511530 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.139376 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0027a1-059e-49c9-91f3-0781aafd3c1e","Type":"ContainerDied","Data":"fc121a594d089688d76156340c72a32c60cee8338bba4bc8e245cbd0b24c28e6"} Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.139722 4990 scope.go:117] "RemoveContainer" containerID="9f0147ca714db6e66ae815ca7586179b3b160dcf0f90124e56efc34c7d17c6e0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.139458 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.182092 4990 scope.go:117] "RemoveContainer" containerID="2a436b4d5dc4befa1e1a6be2d67d03050da869ade61d13258aded031788e2b3e" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.188322 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.216052 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.230910 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:35 crc kubenswrapper[4990]: E1003 11:19:35.231501 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.231569 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api" Oct 03 11:19:35 crc kubenswrapper[4990]: E1003 11:19:35.231619 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api-log" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.231633 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api-log" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.231970 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api-log" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.232015 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" containerName="cinder-api" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.233844 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.237239 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.237415 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.239915 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.241861 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.326466 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-scripts\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.326679 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflsw\" (UniqueName: \"kubernetes.io/projected/a2fed11a-2d54-499e-b4ad-415a874de5dc-kube-api-access-pflsw\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.326811 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.326985 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data-custom\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.327153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.327197 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2fed11a-2d54-499e-b4ad-415a874de5dc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.327270 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.327328 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2fed11a-2d54-499e-b4ad-415a874de5dc-logs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.327430 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.429542 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.429742 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-scripts\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.429813 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflsw\" (UniqueName: \"kubernetes.io/projected/a2fed11a-2d54-499e-b4ad-415a874de5dc-kube-api-access-pflsw\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.429913 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430005 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data-custom\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430085 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430125 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2fed11a-2d54-499e-b4ad-415a874de5dc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430165 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430204 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2fed11a-2d54-499e-b4ad-415a874de5dc-logs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430298 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2fed11a-2d54-499e-b4ad-415a874de5dc-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.430989 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2fed11a-2d54-499e-b4ad-415a874de5dc-logs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: 
I1003 11:19:35.434252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.434745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data-custom\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.435842 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.436551 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-config-data\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.437074 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.438432 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2fed11a-2d54-499e-b4ad-415a874de5dc-scripts\") pod \"cinder-api-0\" (UID: 
\"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.447621 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflsw\" (UniqueName: \"kubernetes.io/projected/a2fed11a-2d54-499e-b4ad-415a874de5dc-kube-api-access-pflsw\") pod \"cinder-api-0\" (UID: \"a2fed11a-2d54-499e-b4ad-415a874de5dc\") " pod="openstack/cinder-api-0" Oct 03 11:19:35 crc kubenswrapper[4990]: I1003 11:19:35.555181 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 11:19:36 crc kubenswrapper[4990]: I1003 11:19:36.014303 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 11:19:36 crc kubenswrapper[4990]: W1003 11:19:36.022828 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2fed11a_2d54_499e_b4ad_415a874de5dc.slice/crio-11af35c496a47f7116e5f1ed91317db4df2d9abcadefd15113fc0b3acd562166 WatchSource:0}: Error finding container 11af35c496a47f7116e5f1ed91317db4df2d9abcadefd15113fc0b3acd562166: Status 404 returned error can't find the container with id 11af35c496a47f7116e5f1ed91317db4df2d9abcadefd15113fc0b3acd562166 Oct 03 11:19:36 crc kubenswrapper[4990]: I1003 11:19:36.151615 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a2fed11a-2d54-499e-b4ad-415a874de5dc","Type":"ContainerStarted","Data":"11af35c496a47f7116e5f1ed91317db4df2d9abcadefd15113fc0b3acd562166"} Oct 03 11:19:36 crc kubenswrapper[4990]: I1003 11:19:36.887266 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0027a1-059e-49c9-91f3-0781aafd3c1e" path="/var/lib/kubelet/pods/6a0027a1-059e-49c9-91f3-0781aafd3c1e/volumes" Oct 03 11:19:37 crc kubenswrapper[4990]: I1003 11:19:37.162943 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a2fed11a-2d54-499e-b4ad-415a874de5dc","Type":"ContainerStarted","Data":"ccaa5e09d2da332253960f0b58a410e17c767d3b3e48ef77b279118864e9f5e7"} Oct 03 11:19:38 crc kubenswrapper[4990]: I1003 11:19:38.175703 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a2fed11a-2d54-499e-b4ad-415a874de5dc","Type":"ContainerStarted","Data":"3b078d17017741df250d476b88cf23f1c9de7624a1d53bb90cdab4579b0dd921"} Oct 03 11:19:38 crc kubenswrapper[4990]: I1003 11:19:38.176096 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 11:19:38 crc kubenswrapper[4990]: I1003 11:19:38.212328 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.212309316 podStartE2EDuration="3.212309316s" podCreationTimestamp="2025-10-03 11:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:19:38.206927548 +0000 UTC m=+5760.003559415" watchObservedRunningTime="2025-10-03 11:19:38.212309316 +0000 UTC m=+5760.008941173" Oct 03 11:19:39 crc kubenswrapper[4990]: I1003 11:19:39.741278 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 11:19:39 crc kubenswrapper[4990]: I1003 11:19:39.802358 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:40 crc kubenswrapper[4990]: I1003 11:19:40.192198 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="cinder-scheduler" containerID="cri-o://a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887" gracePeriod=30 Oct 03 11:19:40 crc kubenswrapper[4990]: I1003 11:19:40.192296 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="probe" containerID="cri-o://27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f" gracePeriod=30 Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.210072 4990 generic.go:334] "Generic (PLEG): container finished" podID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerID="27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f" exitCode=0 Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.210190 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerDied","Data":"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f"} Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.774710 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.969599 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.969639 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.969689 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 
11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.969757 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.969808 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.970188 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.970268 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clfqq\" (UniqueName: \"kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq\") pod \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\" (UID: \"e7a5bcfb-c9fa-4ed1-933b-514b512c3000\") " Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.970910 4990 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.977122 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq" 
(OuterVolumeSpecName: "kube-api-access-clfqq") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "kube-api-access-clfqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.977215 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:41 crc kubenswrapper[4990]: I1003 11:19:41.977940 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts" (OuterVolumeSpecName: "scripts") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.035314 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.073966 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.074112 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clfqq\" (UniqueName: \"kubernetes.io/projected/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-kube-api-access-clfqq\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.074322 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.074390 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.079669 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data" (OuterVolumeSpecName: "config-data") pod "e7a5bcfb-c9fa-4ed1-933b-514b512c3000" (UID: "e7a5bcfb-c9fa-4ed1-933b-514b512c3000"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.175616 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7a5bcfb-c9fa-4ed1-933b-514b512c3000-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.224918 4990 generic.go:334] "Generic (PLEG): container finished" podID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerID="a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887" exitCode=0 Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.224991 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.225930 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerDied","Data":"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887"} Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.226044 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7a5bcfb-c9fa-4ed1-933b-514b512c3000","Type":"ContainerDied","Data":"a04f38a295653c9da46720d876d4d7eaddb4a2efce14f6022da157a6bfd922e0"} Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.226115 4990 scope.go:117] "RemoveContainer" containerID="27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.270612 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.275217 4990 scope.go:117] "RemoveContainer" containerID="a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.283987 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.291302 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:42 crc kubenswrapper[4990]: E1003 11:19:42.291746 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="probe" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.291770 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="probe" Oct 03 11:19:42 crc kubenswrapper[4990]: E1003 11:19:42.291796 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="cinder-scheduler" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.291805 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="cinder-scheduler" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.292028 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="cinder-scheduler" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.292056 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" containerName="probe" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.293206 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.303755 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.325336 4990 scope.go:117] "RemoveContainer" containerID="27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.325900 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:42 crc kubenswrapper[4990]: E1003 11:19:42.327448 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f\": container with ID starting with 27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f not found: ID does not exist" containerID="27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.327505 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f"} err="failed to get container status \"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f\": rpc error: code = NotFound desc = could not find container \"27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f\": container with ID starting with 27ed313ab66cafefc318ff20f0ca8d410827bef4543ceb61f7ef7dacbd315f5f not found: ID does not exist" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.327558 4990 scope.go:117] "RemoveContainer" containerID="a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887" Oct 03 11:19:42 crc kubenswrapper[4990]: E1003 11:19:42.330728 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887\": container with ID starting with a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887 not found: ID does not exist" containerID="a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.330770 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887"} err="failed to get container status \"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887\": rpc error: code = NotFound desc = could not find container \"a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887\": container with ID starting with a65cd5ad893702d40e27dca49c0012ca1361c712d3d96408d08eab40a868c887 not found: ID does not exist" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.481840 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw9hl\" (UniqueName: \"kubernetes.io/projected/24ec0c7b-2701-4816-9a36-771572d653f0-kube-api-access-gw9hl\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.481931 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.482203 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec0c7b-2701-4816-9a36-771572d653f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.482294 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.482788 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.482982 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585173 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585382 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec0c7b-2701-4816-9a36-771572d653f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc 
kubenswrapper[4990]: I1003 11:19:42.585451 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585507 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24ec0c7b-2701-4816-9a36-771572d653f0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585564 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585775 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.585884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw9hl\" (UniqueName: \"kubernetes.io/projected/24ec0c7b-2701-4816-9a36-771572d653f0-kube-api-access-gw9hl\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.589308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.590638 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-scripts\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.591035 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.591188 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ec0c7b-2701-4816-9a36-771572d653f0-config-data\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.609833 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw9hl\" (UniqueName: \"kubernetes.io/projected/24ec0c7b-2701-4816-9a36-771572d653f0-kube-api-access-gw9hl\") pod \"cinder-scheduler-0\" (UID: \"24ec0c7b-2701-4816-9a36-771572d653f0\") " pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.637609 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.895135 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a5bcfb-c9fa-4ed1-933b-514b512c3000" path="/var/lib/kubelet/pods/e7a5bcfb-c9fa-4ed1-933b-514b512c3000/volumes" Oct 03 11:19:42 crc kubenswrapper[4990]: I1003 11:19:42.912030 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 11:19:42 crc kubenswrapper[4990]: W1003 11:19:42.924311 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24ec0c7b_2701_4816_9a36_771572d653f0.slice/crio-c58cdf13da9fffd2d50b83ccf8086200f5cb54c5a13d470e738e707975b87557 WatchSource:0}: Error finding container c58cdf13da9fffd2d50b83ccf8086200f5cb54c5a13d470e738e707975b87557: Status 404 returned error can't find the container with id c58cdf13da9fffd2d50b83ccf8086200f5cb54c5a13d470e738e707975b87557 Oct 03 11:19:43 crc kubenswrapper[4990]: I1003 11:19:43.238202 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24ec0c7b-2701-4816-9a36-771572d653f0","Type":"ContainerStarted","Data":"c58cdf13da9fffd2d50b83ccf8086200f5cb54c5a13d470e738e707975b87557"} Oct 03 11:19:43 crc kubenswrapper[4990]: I1003 11:19:43.872371 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:19:43 crc kubenswrapper[4990]: E1003 11:19:43.872950 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:19:44 
crc kubenswrapper[4990]: I1003 11:19:44.252822 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24ec0c7b-2701-4816-9a36-771572d653f0","Type":"ContainerStarted","Data":"f22cc8c1a0637dd6d135f1f787930641a127131c72b05963313ee1c85f58a345"} Oct 03 11:19:44 crc kubenswrapper[4990]: I1003 11:19:44.253286 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24ec0c7b-2701-4816-9a36-771572d653f0","Type":"ContainerStarted","Data":"cb4998db5645b9a0f0255e798bffaacc82245292358b5196f499d2f28f6a8968"} Oct 03 11:19:44 crc kubenswrapper[4990]: I1003 11:19:44.282651 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.28262818 podStartE2EDuration="2.28262818s" podCreationTimestamp="2025-10-03 11:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:19:44.274727587 +0000 UTC m=+5766.071359454" watchObservedRunningTime="2025-10-03 11:19:44.28262818 +0000 UTC m=+5766.079260067" Oct 03 11:19:47 crc kubenswrapper[4990]: I1003 11:19:47.406783 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 11:19:47 crc kubenswrapper[4990]: I1003 11:19:47.638078 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 11:19:48 crc kubenswrapper[4990]: I1003 11:19:48.643020 4990 scope.go:117] "RemoveContainer" containerID="26774592c48626448f932d8de38e128d2946d821756d409380b14325e928bb90" Oct 03 11:19:48 crc kubenswrapper[4990]: I1003 11:19:48.685494 4990 scope.go:117] "RemoveContainer" containerID="57f1fd2d131c7c55fa3da8fa6fd8c8784a7f06974761dc67aabc251389e14739" Oct 03 11:19:48 crc kubenswrapper[4990]: I1003 11:19:48.713572 4990 scope.go:117] "RemoveContainer" 
containerID="7e4359c799ab7a94d3ed20038926338225291d8029ceabfe4f9bec7fbbe204cc" Oct 03 11:19:52 crc kubenswrapper[4990]: I1003 11:19:52.845828 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 11:19:57 crc kubenswrapper[4990]: I1003 11:19:57.872007 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:19:57 crc kubenswrapper[4990]: E1003 11:19:57.872596 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.314770 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6cq5h"] Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.316844 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6cq5h" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.331416 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6cq5h"] Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.489477 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgfnt\" (UniqueName: \"kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt\") pod \"glance-db-create-6cq5h\" (UID: \"c78899a2-aa4b-490f-a0b5-8b1de2c07012\") " pod="openstack/glance-db-create-6cq5h" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.592307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgfnt\" (UniqueName: \"kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt\") pod \"glance-db-create-6cq5h\" (UID: \"c78899a2-aa4b-490f-a0b5-8b1de2c07012\") " pod="openstack/glance-db-create-6cq5h" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.622914 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgfnt\" (UniqueName: \"kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt\") pod \"glance-db-create-6cq5h\" (UID: \"c78899a2-aa4b-490f-a0b5-8b1de2c07012\") " pod="openstack/glance-db-create-6cq5h" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.663545 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6cq5h" Oct 03 11:19:58 crc kubenswrapper[4990]: I1003 11:19:58.938631 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6cq5h"] Oct 03 11:19:59 crc kubenswrapper[4990]: I1003 11:19:59.417780 4990 generic.go:334] "Generic (PLEG): container finished" podID="c78899a2-aa4b-490f-a0b5-8b1de2c07012" containerID="43ad5333a9497a376d4980a650d1e111dd599e0ce8d1a9e8136b8c09a8fcc1b1" exitCode=0 Oct 03 11:19:59 crc kubenswrapper[4990]: I1003 11:19:59.417838 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6cq5h" event={"ID":"c78899a2-aa4b-490f-a0b5-8b1de2c07012","Type":"ContainerDied","Data":"43ad5333a9497a376d4980a650d1e111dd599e0ce8d1a9e8136b8c09a8fcc1b1"} Oct 03 11:19:59 crc kubenswrapper[4990]: I1003 11:19:59.417865 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6cq5h" event={"ID":"c78899a2-aa4b-490f-a0b5-8b1de2c07012","Type":"ContainerStarted","Data":"a487b50658a5a44085f373300511294beeacc138dae36f231c53c137357e3b95"} Oct 03 11:20:00 crc kubenswrapper[4990]: I1003 11:20:00.846907 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6cq5h" Oct 03 11:20:00 crc kubenswrapper[4990]: I1003 11:20:00.938405 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgfnt\" (UniqueName: \"kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt\") pod \"c78899a2-aa4b-490f-a0b5-8b1de2c07012\" (UID: \"c78899a2-aa4b-490f-a0b5-8b1de2c07012\") " Oct 03 11:20:00 crc kubenswrapper[4990]: I1003 11:20:00.944523 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt" (OuterVolumeSpecName: "kube-api-access-wgfnt") pod "c78899a2-aa4b-490f-a0b5-8b1de2c07012" (UID: "c78899a2-aa4b-490f-a0b5-8b1de2c07012"). InnerVolumeSpecName "kube-api-access-wgfnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:01 crc kubenswrapper[4990]: I1003 11:20:01.040683 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgfnt\" (UniqueName: \"kubernetes.io/projected/c78899a2-aa4b-490f-a0b5-8b1de2c07012-kube-api-access-wgfnt\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:01 crc kubenswrapper[4990]: I1003 11:20:01.439385 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6cq5h" event={"ID":"c78899a2-aa4b-490f-a0b5-8b1de2c07012","Type":"ContainerDied","Data":"a487b50658a5a44085f373300511294beeacc138dae36f231c53c137357e3b95"} Oct 03 11:20:01 crc kubenswrapper[4990]: I1003 11:20:01.439427 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a487b50658a5a44085f373300511294beeacc138dae36f231c53c137357e3b95" Oct 03 11:20:01 crc kubenswrapper[4990]: I1003 11:20:01.439446 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6cq5h" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.444865 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8080-account-create-5kpjw"] Oct 03 11:20:08 crc kubenswrapper[4990]: E1003 11:20:08.446237 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78899a2-aa4b-490f-a0b5-8b1de2c07012" containerName="mariadb-database-create" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.446264 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78899a2-aa4b-490f-a0b5-8b1de2c07012" containerName="mariadb-database-create" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.446717 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78899a2-aa4b-490f-a0b5-8b1de2c07012" containerName="mariadb-database-create" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.447880 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.451151 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.468806 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8080-account-create-5kpjw"] Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.589367 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r7kf\" (UniqueName: \"kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf\") pod \"glance-8080-account-create-5kpjw\" (UID: \"6289514c-49de-491b-8f6a-f2f300f4ec76\") " pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.691547 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r7kf\" (UniqueName: 
\"kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf\") pod \"glance-8080-account-create-5kpjw\" (UID: \"6289514c-49de-491b-8f6a-f2f300f4ec76\") " pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.715722 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r7kf\" (UniqueName: \"kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf\") pod \"glance-8080-account-create-5kpjw\" (UID: \"6289514c-49de-491b-8f6a-f2f300f4ec76\") " pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.793141 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:08 crc kubenswrapper[4990]: I1003 11:20:08.882285 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:20:08 crc kubenswrapper[4990]: E1003 11:20:08.882615 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:20:09 crc kubenswrapper[4990]: I1003 11:20:09.323307 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8080-account-create-5kpjw"] Oct 03 11:20:09 crc kubenswrapper[4990]: W1003 11:20:09.329930 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6289514c_49de_491b_8f6a_f2f300f4ec76.slice/crio-eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22 WatchSource:0}: Error finding 
container eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22: Status 404 returned error can't find the container with id eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22 Oct 03 11:20:09 crc kubenswrapper[4990]: I1003 11:20:09.523590 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8080-account-create-5kpjw" event={"ID":"6289514c-49de-491b-8f6a-f2f300f4ec76","Type":"ContainerStarted","Data":"eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22"} Oct 03 11:20:10 crc kubenswrapper[4990]: I1003 11:20:10.534685 4990 generic.go:334] "Generic (PLEG): container finished" podID="6289514c-49de-491b-8f6a-f2f300f4ec76" containerID="94576bebfbb3aafab588bc36e3e6ee0e45cf7dee92250bd249af2da3c3cd0617" exitCode=0 Oct 03 11:20:10 crc kubenswrapper[4990]: I1003 11:20:10.534750 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8080-account-create-5kpjw" event={"ID":"6289514c-49de-491b-8f6a-f2f300f4ec76","Type":"ContainerDied","Data":"94576bebfbb3aafab588bc36e3e6ee0e45cf7dee92250bd249af2da3c3cd0617"} Oct 03 11:20:11 crc kubenswrapper[4990]: I1003 11:20:11.952297 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.088896 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r7kf\" (UniqueName: \"kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf\") pod \"6289514c-49de-491b-8f6a-f2f300f4ec76\" (UID: \"6289514c-49de-491b-8f6a-f2f300f4ec76\") " Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.094067 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf" (OuterVolumeSpecName: "kube-api-access-7r7kf") pod "6289514c-49de-491b-8f6a-f2f300f4ec76" (UID: "6289514c-49de-491b-8f6a-f2f300f4ec76"). InnerVolumeSpecName "kube-api-access-7r7kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.190714 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r7kf\" (UniqueName: \"kubernetes.io/projected/6289514c-49de-491b-8f6a-f2f300f4ec76-kube-api-access-7r7kf\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.557695 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8080-account-create-5kpjw" event={"ID":"6289514c-49de-491b-8f6a-f2f300f4ec76","Type":"ContainerDied","Data":"eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22"} Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.558033 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8080-account-create-5kpjw" Oct 03 11:20:12 crc kubenswrapper[4990]: I1003 11:20:12.558062 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaa65e414ca88b2a3d152da47a70342cb13b977d6a1a5f9a2cfea71085e47d22" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.661685 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w97ck"] Oct 03 11:20:13 crc kubenswrapper[4990]: E1003 11:20:13.662057 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6289514c-49de-491b-8f6a-f2f300f4ec76" containerName="mariadb-account-create" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.662070 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6289514c-49de-491b-8f6a-f2f300f4ec76" containerName="mariadb-account-create" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.662228 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6289514c-49de-491b-8f6a-f2f300f4ec76" containerName="mariadb-account-create" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.662793 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.664644 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-whwcz" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.664839 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.677507 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w97ck"] Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.821277 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.821327 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj79g\" (UniqueName: \"kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.821366 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.821419 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.922659 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.922960 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj79g\" (UniqueName: \"kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.923317 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.923470 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.927745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data\") pod \"glance-db-sync-w97ck\" (UID: 
\"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.927745 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.927850 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.943287 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj79g\" (UniqueName: \"kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g\") pod \"glance-db-sync-w97ck\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:13 crc kubenswrapper[4990]: I1003 11:20:13.983226 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:14 crc kubenswrapper[4990]: I1003 11:20:14.518132 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w97ck"] Oct 03 11:20:14 crc kubenswrapper[4990]: I1003 11:20:14.581549 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w97ck" event={"ID":"6defce53-beca-45b7-9450-398c8ee12108","Type":"ContainerStarted","Data":"a12c670e331e79f16ec72e1493d9d6d3a09260e219e85f8a53bd9a4dfe6cff4b"} Oct 03 11:20:15 crc kubenswrapper[4990]: I1003 11:20:15.595643 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w97ck" event={"ID":"6defce53-beca-45b7-9450-398c8ee12108","Type":"ContainerStarted","Data":"02a10f97eb136ec85f01e171348a05a0e8c51ab086a5c244e78e21921faeff03"} Oct 03 11:20:15 crc kubenswrapper[4990]: I1003 11:20:15.617208 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w97ck" podStartSLOduration=2.617188801 podStartE2EDuration="2.617188801s" podCreationTimestamp="2025-10-03 11:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:15.608556838 +0000 UTC m=+5797.405188735" watchObservedRunningTime="2025-10-03 11:20:15.617188801 +0000 UTC m=+5797.413820658" Oct 03 11:20:18 crc kubenswrapper[4990]: I1003 11:20:18.620819 4990 generic.go:334] "Generic (PLEG): container finished" podID="6defce53-beca-45b7-9450-398c8ee12108" containerID="02a10f97eb136ec85f01e171348a05a0e8c51ab086a5c244e78e21921faeff03" exitCode=0 Oct 03 11:20:18 crc kubenswrapper[4990]: I1003 11:20:18.620908 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w97ck" event={"ID":"6defce53-beca-45b7-9450-398c8ee12108","Type":"ContainerDied","Data":"02a10f97eb136ec85f01e171348a05a0e8c51ab086a5c244e78e21921faeff03"} Oct 03 11:20:20 crc kubenswrapper[4990]: 
I1003 11:20:20.110387 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.240351 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data\") pod \"6defce53-beca-45b7-9450-398c8ee12108\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.240459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj79g\" (UniqueName: \"kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g\") pod \"6defce53-beca-45b7-9450-398c8ee12108\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.240567 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data\") pod \"6defce53-beca-45b7-9450-398c8ee12108\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.240660 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle\") pod \"6defce53-beca-45b7-9450-398c8ee12108\" (UID: \"6defce53-beca-45b7-9450-398c8ee12108\") " Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.249405 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g" (OuterVolumeSpecName: "kube-api-access-tj79g") pod "6defce53-beca-45b7-9450-398c8ee12108" (UID: "6defce53-beca-45b7-9450-398c8ee12108"). InnerVolumeSpecName "kube-api-access-tj79g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.250782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6defce53-beca-45b7-9450-398c8ee12108" (UID: "6defce53-beca-45b7-9450-398c8ee12108"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.294076 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6defce53-beca-45b7-9450-398c8ee12108" (UID: "6defce53-beca-45b7-9450-398c8ee12108"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.301717 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data" (OuterVolumeSpecName: "config-data") pod "6defce53-beca-45b7-9450-398c8ee12108" (UID: "6defce53-beca-45b7-9450-398c8ee12108"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.343076 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj79g\" (UniqueName: \"kubernetes.io/projected/6defce53-beca-45b7-9450-398c8ee12108-kube-api-access-tj79g\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.343127 4990 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.343140 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.343154 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6defce53-beca-45b7-9450-398c8ee12108-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.642205 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w97ck" event={"ID":"6defce53-beca-45b7-9450-398c8ee12108","Type":"ContainerDied","Data":"a12c670e331e79f16ec72e1493d9d6d3a09260e219e85f8a53bd9a4dfe6cff4b"} Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.642244 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a12c670e331e79f16ec72e1493d9d6d3a09260e219e85f8a53bd9a4dfe6cff4b" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.642414 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w97ck" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.926566 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:20 crc kubenswrapper[4990]: E1003 11:20:20.927388 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6defce53-beca-45b7-9450-398c8ee12108" containerName="glance-db-sync" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.927404 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6defce53-beca-45b7-9450-398c8ee12108" containerName="glance-db-sync" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.927632 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6defce53-beca-45b7-9450-398c8ee12108" containerName="glance-db-sync" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.928830 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.931023 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.931080 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.934295 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-whwcz" Oct 03 11:20:20 crc kubenswrapper[4990]: I1003 11:20:20.956630 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.045451 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.046955 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054164 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054250 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kvn\" (UniqueName: \"kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054297 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054394 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054417 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.054480 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.061867 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.111351 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.112736 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.115384 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.138384 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.155771 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kvn\" (UniqueName: \"kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156100 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156295 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156426 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjd49\" (UniqueName: \"kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156570 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156778 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156835 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.156966 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.157000 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.157606 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs\") pod 
\"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.157856 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.160944 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.163649 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.175984 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kvn\" (UniqueName: \"kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.187257 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " 
pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.251457 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258370 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258438 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258458 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjd49\" (UniqueName: \"kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258474 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mslv\" (UniqueName: \"kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258490 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258538 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258554 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258577 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258624 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.258702 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.259600 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.263558 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.264208 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.267242 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.285373 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjd49\" (UniqueName: \"kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49\") pod \"dnsmasq-dns-5bb849867-hs62h\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360306 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360669 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360733 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360790 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mslv\" (UniqueName: \"kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv\") pod \"glance-default-internal-api-0\" (UID: 
\"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360831 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.360866 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.363543 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.365591 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.366340 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 
crc kubenswrapper[4990]: I1003 11:20:21.372374 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.372964 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.374318 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.385851 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mslv\" (UniqueName: \"kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv\") pod \"glance-default-internal-api-0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.434128 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.833817 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.872919 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:20:21 crc kubenswrapper[4990]: E1003 11:20:21.873111 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:20:21 crc kubenswrapper[4990]: I1003 11:20:21.887708 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:21 crc kubenswrapper[4990]: W1003 11:20:21.898545 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod239a5b5d_704f_43d3_a8ed_a57daaa15210.slice/crio-db19654e336c30680dfa52856a43a33d599f209d4086d73456a1bff1802ba833 WatchSource:0}: Error finding container db19654e336c30680dfa52856a43a33d599f209d4086d73456a1bff1802ba833: Status 404 returned error can't find the container with id db19654e336c30680dfa52856a43a33d599f209d4086d73456a1bff1802ba833 Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.104860 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.176779 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:22 crc kubenswrapper[4990]: W1003 
11:20:22.180572 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a1711f7_5e0e_4054_a342_da21cc35bbf0.slice/crio-c2f887e4000d434d4e974635765247266807ec4e57041cf2741e6dc87f50275a WatchSource:0}: Error finding container c2f887e4000d434d4e974635765247266807ec4e57041cf2741e6dc87f50275a: Status 404 returned error can't find the container with id c2f887e4000d434d4e974635765247266807ec4e57041cf2741e6dc87f50275a Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.677298 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerStarted","Data":"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0"} Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.677702 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerStarted","Data":"db19654e336c30680dfa52856a43a33d599f209d4086d73456a1bff1802ba833"} Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.679718 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerStarted","Data":"c2f887e4000d434d4e974635765247266807ec4e57041cf2741e6dc87f50275a"} Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.681458 4990 generic.go:334] "Generic (PLEG): container finished" podID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerID="d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef" exitCode=0 Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.681541 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb849867-hs62h" 
event={"ID":"1f206a0b-569e-4d77-ac0b-f7a41672b39c","Type":"ContainerDied","Data":"d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef"} Oct 03 11:20:22 crc kubenswrapper[4990]: I1003 11:20:22.681563 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb849867-hs62h" event={"ID":"1f206a0b-569e-4d77-ac0b-f7a41672b39c","Type":"ContainerStarted","Data":"dc7f405fa2b173085c59589ee19a97c6ea0b45ca47d5072f3aea4e4d57f8b1ae"} Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.401119 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.690374 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerStarted","Data":"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb"} Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.690782 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerStarted","Data":"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca"} Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.690599 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-httpd" containerID="cri-o://9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" gracePeriod=30 Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.690542 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-log" containerID="cri-o://dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" gracePeriod=30 Oct 03 11:20:23 crc 
kubenswrapper[4990]: I1003 11:20:23.692862 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb849867-hs62h" event={"ID":"1f206a0b-569e-4d77-ac0b-f7a41672b39c","Type":"ContainerStarted","Data":"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244"} Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.692927 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.695556 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerStarted","Data":"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543"} Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.695690 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-log" containerID="cri-o://2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" gracePeriod=30 Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.695703 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-httpd" containerID="cri-o://010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" gracePeriod=30 Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.720230 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.720211822 podStartE2EDuration="2.720211822s" podCreationTimestamp="2025-10-03 11:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:23.716815944 +0000 UTC m=+5805.513447801" 
watchObservedRunningTime="2025-10-03 11:20:23.720211822 +0000 UTC m=+5805.516843679" Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.746700 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bb849867-hs62h" podStartSLOduration=2.746672795 podStartE2EDuration="2.746672795s" podCreationTimestamp="2025-10-03 11:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:23.738475084 +0000 UTC m=+5805.535106961" watchObservedRunningTime="2025-10-03 11:20:23.746672795 +0000 UTC m=+5805.543304672" Oct 03 11:20:23 crc kubenswrapper[4990]: I1003 11:20:23.769420 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.769402903 podStartE2EDuration="3.769402903s" podCreationTimestamp="2025-10-03 11:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:23.764920087 +0000 UTC m=+5805.561551944" watchObservedRunningTime="2025-10-03 11:20:23.769402903 +0000 UTC m=+5805.566034760" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.319390 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452663 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452831 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452860 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452890 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452913 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.452937 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mslv\" (UniqueName: 
\"kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv\") pod \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\" (UID: \"2a1711f7-5e0e-4054-a342-da21cc35bbf0\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.453268 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs" (OuterVolumeSpecName: "logs") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.453312 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.453651 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.453667 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a1711f7-5e0e-4054-a342-da21cc35bbf0-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.458787 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv" (OuterVolumeSpecName: "kube-api-access-2mslv") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "kube-api-access-2mslv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.458914 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts" (OuterVolumeSpecName: "scripts") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.480692 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.523798 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data" (OuterVolumeSpecName: "config-data") pod "2a1711f7-5e0e-4054-a342-da21cc35bbf0" (UID: "2a1711f7-5e0e-4054-a342-da21cc35bbf0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.555333 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.555385 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.555404 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mslv\" (UniqueName: \"kubernetes.io/projected/2a1711f7-5e0e-4054-a342-da21cc35bbf0-kube-api-access-2mslv\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.555424 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1711f7-5e0e-4054-a342-da21cc35bbf0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.572336 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656183 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kvn\" (UniqueName: \"kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656230 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656328 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656362 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656385 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.656544 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle\") pod \"239a5b5d-704f-43d3-a8ed-a57daaa15210\" (UID: \"239a5b5d-704f-43d3-a8ed-a57daaa15210\") " Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.658096 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs" (OuterVolumeSpecName: "logs") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.658100 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.661400 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts" (OuterVolumeSpecName: "scripts") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.661610 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn" (OuterVolumeSpecName: "kube-api-access-x8kvn") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "kube-api-access-x8kvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.696681 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.701020 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data" (OuterVolumeSpecName: "config-data") pod "239a5b5d-704f-43d3-a8ed-a57daaa15210" (UID: "239a5b5d-704f-43d3-a8ed-a57daaa15210"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706655 4990 generic.go:334] "Generic (PLEG): container finished" podID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerID="010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" exitCode=0 Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706685 4990 generic.go:334] "Generic (PLEG): container finished" podID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerID="2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" exitCode=143 Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706735 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerDied","Data":"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706763 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerDied","Data":"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706761 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706783 4990 scope.go:117] "RemoveContainer" containerID="010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.706773 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"239a5b5d-704f-43d3-a8ed-a57daaa15210","Type":"ContainerDied","Data":"db19654e336c30680dfa52856a43a33d599f209d4086d73456a1bff1802ba833"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712174 4990 generic.go:334] "Generic (PLEG): container finished" podID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerID="9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" exitCode=0 Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712252 4990 generic.go:334] "Generic (PLEG): container finished" podID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerID="dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" exitCode=143 Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712360 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerDied","Data":"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712396 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712418 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerDied","Data":"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.712430 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a1711f7-5e0e-4054-a342-da21cc35bbf0","Type":"ContainerDied","Data":"c2f887e4000d434d4e974635765247266807ec4e57041cf2741e6dc87f50275a"} Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.745224 4990 scope.go:117] "RemoveContainer" containerID="2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.753726 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759138 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759165 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759175 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759184 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/239a5b5d-704f-43d3-a8ed-a57daaa15210-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759196 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kvn\" (UniqueName: \"kubernetes.io/projected/239a5b5d-704f-43d3-a8ed-a57daaa15210-kube-api-access-x8kvn\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.759204 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/239a5b5d-704f-43d3-a8ed-a57daaa15210-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.762582 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.773723 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.786560 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.792212 4990 scope.go:117] "RemoveContainer" containerID="010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.792895 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543\": container with ID starting with 010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543 not found: ID does not exist" containerID="010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.792926 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543"} err="failed to get container status \"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543\": rpc error: code = NotFound desc = could not find container \"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543\": container with ID starting with 010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543 not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.792946 4990 scope.go:117] "RemoveContainer" containerID="2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.793613 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0\": container with ID starting with 2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0 not found: ID does not exist" containerID="2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.793640 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0"} err="failed to get container status \"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0\": rpc error: code = NotFound desc = could not find container \"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0\": container with ID starting with 2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0 not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.793655 4990 scope.go:117] "RemoveContainer" containerID="010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.794976 4990 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543"} err="failed to get container status \"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543\": rpc error: code = NotFound desc = could not find container \"010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543\": container with ID starting with 010339f5dbc5e25788cb7478b60dbcd4e07af43acf9a84057734b8374be44543 not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.794995 4990 scope.go:117] "RemoveContainer" containerID="2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.795339 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0"} err="failed to get container status \"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0\": rpc error: code = NotFound desc = could not find container \"2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0\": container with ID starting with 2e6a6151fd11354bcfc28bdd3e67e9a0b2bd69852fe4edf9c1a2d9ed4da000d0 not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.795359 4990 scope.go:117] "RemoveContainer" containerID="9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.800797 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.801196 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-log" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801211 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-log" Oct 03 11:20:24 crc 
kubenswrapper[4990]: E1003 11:20:24.801227 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-log" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801233 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-log" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.801256 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801263 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.801280 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801286 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801442 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-log" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801459 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801467 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" containerName="glance-httpd" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.801475 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" containerName="glance-log" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 
11:20:24.802410 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.808300 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.808380 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.808503 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-whwcz" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.808783 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.816949 4990 scope.go:117] "RemoveContainer" containerID="dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.826003 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.828338 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.832285 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.832907 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.846322 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.850030 4990 scope.go:117] "RemoveContainer" containerID="9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.850546 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb\": container with ID starting with 9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb not found: ID does not exist" containerID="9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.850582 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb"} err="failed to get container status \"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb\": rpc error: code = NotFound desc = could not find container \"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb\": container with ID starting with 9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.850603 4990 scope.go:117] "RemoveContainer" 
containerID="dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" Oct 03 11:20:24 crc kubenswrapper[4990]: E1003 11:20:24.851174 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca\": container with ID starting with dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca not found: ID does not exist" containerID="dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.851217 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca"} err="failed to get container status \"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca\": rpc error: code = NotFound desc = could not find container \"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca\": container with ID starting with dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.851265 4990 scope.go:117] "RemoveContainer" containerID="9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.851989 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb"} err="failed to get container status \"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb\": rpc error: code = NotFound desc = could not find container \"9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb\": container with ID starting with 9b8df95c583d0bd82b2f9cf35f3f4f228849eaeacc91c83f1d76a29ebb0af5cb not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.852028 4990 scope.go:117] 
"RemoveContainer" containerID="dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.852540 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca"} err="failed to get container status \"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca\": rpc error: code = NotFound desc = could not find container \"dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca\": container with ID starting with dd68d947fd7185ad2dd2ceccaa5c6af670828482a8ae7fc32ff92db3568d4eca not found: ID does not exist" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.859167 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.890970 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239a5b5d-704f-43d3-a8ed-a57daaa15210" path="/var/lib/kubelet/pods/239a5b5d-704f-43d3-a8ed-a57daaa15210/volumes" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.891759 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1711f7-5e0e-4054-a342-da21cc35bbf0" path="/var/lib/kubelet/pods/2a1711f7-5e0e-4054-a342-da21cc35bbf0/volumes" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971711 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gbx\" (UniqueName: \"kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971770 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971808 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971829 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971872 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971904 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971935 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.971964 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972014 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972053 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972076 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972096 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972130 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6mj\" (UniqueName: \"kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:24 crc kubenswrapper[4990]: I1003 11:20:24.972178 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073666 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073718 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073739 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073770 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6mj\" (UniqueName: \"kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073821 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073851 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gbx\" (UniqueName: \"kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073867 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073890 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073905 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073932 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073956 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.073981 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.074002 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.074034 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.074241 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.074470 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.075044 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.075252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.077728 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.078004 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.078122 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.079612 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.079754 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc 
kubenswrapper[4990]: I1003 11:20:25.080129 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.080716 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.086175 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.099613 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gbx\" (UniqueName: \"kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx\") pod \"glance-default-external-api-0\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.099915 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6mj\" (UniqueName: \"kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj\") pod \"glance-default-internal-api-0\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.132874 4990 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.157423 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.785118 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:20:25 crc kubenswrapper[4990]: W1003 11:20:25.791398 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660ed9e6_0b82_4108_a6f4_d348b90db8d2.slice/crio-ee932a488f47fdbe40824db0692961e9a86dec9dd1588b72b2229dae73773c94 WatchSource:0}: Error finding container ee932a488f47fdbe40824db0692961e9a86dec9dd1588b72b2229dae73773c94: Status 404 returned error can't find the container with id ee932a488f47fdbe40824db0692961e9a86dec9dd1588b72b2229dae73773c94 Oct 03 11:20:25 crc kubenswrapper[4990]: I1003 11:20:25.886038 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:20:25 crc kubenswrapper[4990]: W1003 11:20:25.896318 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56216570_ed46_42c2_98a7_c6986edd89e9.slice/crio-1e4bd7dc531c10851c16973aea7aa230d211b8816ddb461d40f2d08db4b2fda9 WatchSource:0}: Error finding container 1e4bd7dc531c10851c16973aea7aa230d211b8816ddb461d40f2d08db4b2fda9: Status 404 returned error can't find the container with id 1e4bd7dc531c10851c16973aea7aa230d211b8816ddb461d40f2d08db4b2fda9 Oct 03 11:20:26 crc kubenswrapper[4990]: I1003 11:20:26.738580 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerStarted","Data":"2cacbb7f9df592a55f01ff914f7723e441ad894fabfd848c9f8b1b934c47d48b"} Oct 03 11:20:26 crc kubenswrapper[4990]: I1003 11:20:26.738857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerStarted","Data":"1e4bd7dc531c10851c16973aea7aa230d211b8816ddb461d40f2d08db4b2fda9"} Oct 03 11:20:26 crc kubenswrapper[4990]: I1003 11:20:26.742474 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerStarted","Data":"7e546ca8c8fe3b8c85be0de3d8de112917e66d03189481086058603db75386ba"} Oct 03 11:20:26 crc kubenswrapper[4990]: I1003 11:20:26.742558 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerStarted","Data":"ee932a488f47fdbe40824db0692961e9a86dec9dd1588b72b2229dae73773c94"} Oct 03 11:20:27 crc kubenswrapper[4990]: I1003 11:20:27.752976 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerStarted","Data":"2b654eb5a2db01992f65019e694c6843e211ff9263b940caffdb0e23a8d6e865"} Oct 03 11:20:27 crc kubenswrapper[4990]: I1003 11:20:27.756309 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerStarted","Data":"2238d8b26788c9ac7065f48d5b95abea3b7acec0b17d161d16a98893ee2488f8"} Oct 03 11:20:27 crc kubenswrapper[4990]: I1003 11:20:27.789396 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.789377338 podStartE2EDuration="3.789377338s" 
podCreationTimestamp="2025-10-03 11:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:27.777556203 +0000 UTC m=+5809.574188090" watchObservedRunningTime="2025-10-03 11:20:27.789377338 +0000 UTC m=+5809.586009185" Oct 03 11:20:27 crc kubenswrapper[4990]: I1003 11:20:27.805291 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.8052723889999998 podStartE2EDuration="3.805272389s" podCreationTimestamp="2025-10-03 11:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:20:27.803162534 +0000 UTC m=+5809.599794391" watchObservedRunningTime="2025-10-03 11:20:27.805272389 +0000 UTC m=+5809.601904246" Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.374706 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.465468 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.466031 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="dnsmasq-dns" containerID="cri-o://f7445bc5a4ec5736c58bc494822424840d2afc5903fb6db097b2521ff9b8224c" gracePeriod=10 Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.794703 4990 generic.go:334] "Generic (PLEG): container finished" podID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerID="f7445bc5a4ec5736c58bc494822424840d2afc5903fb6db097b2521ff9b8224c" exitCode=0 Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.794905 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" event={"ID":"670a7ccb-4dec-44e3-aaea-53cb7b576bdb","Type":"ContainerDied","Data":"f7445bc5a4ec5736c58bc494822424840d2afc5903fb6db097b2521ff9b8224c"} Oct 03 11:20:31 crc kubenswrapper[4990]: I1003 11:20:31.974317 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.119230 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config\") pod \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.119503 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pphql\" (UniqueName: \"kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql\") pod \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.119594 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb\") pod \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.119757 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc\") pod \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.119828 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb\") pod \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\" (UID: \"670a7ccb-4dec-44e3-aaea-53cb7b576bdb\") " Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.126748 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql" (OuterVolumeSpecName: "kube-api-access-pphql") pod "670a7ccb-4dec-44e3-aaea-53cb7b576bdb" (UID: "670a7ccb-4dec-44e3-aaea-53cb7b576bdb"). InnerVolumeSpecName "kube-api-access-pphql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.171065 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "670a7ccb-4dec-44e3-aaea-53cb7b576bdb" (UID: "670a7ccb-4dec-44e3-aaea-53cb7b576bdb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.175807 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "670a7ccb-4dec-44e3-aaea-53cb7b576bdb" (UID: "670a7ccb-4dec-44e3-aaea-53cb7b576bdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.187595 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "670a7ccb-4dec-44e3-aaea-53cb7b576bdb" (UID: "670a7ccb-4dec-44e3-aaea-53cb7b576bdb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.198008 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config" (OuterVolumeSpecName: "config") pod "670a7ccb-4dec-44e3-aaea-53cb7b576bdb" (UID: "670a7ccb-4dec-44e3-aaea-53cb7b576bdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.222057 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pphql\" (UniqueName: \"kubernetes.io/projected/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-kube-api-access-pphql\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.222110 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.222131 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.222148 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.222161 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/670a7ccb-4dec-44e3-aaea-53cb7b576bdb-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.810069 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" 
event={"ID":"670a7ccb-4dec-44e3-aaea-53cb7b576bdb","Type":"ContainerDied","Data":"a58c822c4802e6ef1d0aba2ffed7b9fe22450d445e0a7ce628a99a636063af02"} Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.810499 4990 scope.go:117] "RemoveContainer" containerID="f7445bc5a4ec5736c58bc494822424840d2afc5903fb6db097b2521ff9b8224c" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.810112 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ddb76bc4c-gczgf" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.843608 4990 scope.go:117] "RemoveContainer" containerID="c230189993db54c1a98be05ee508a401ad3959b63b83568d61ba01e914c2d667" Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.861255 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:20:32 crc kubenswrapper[4990]: I1003 11:20:32.891479 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ddb76bc4c-gczgf"] Oct 03 11:20:34 crc kubenswrapper[4990]: I1003 11:20:34.885458 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" path="/var/lib/kubelet/pods/670a7ccb-4dec-44e3-aaea-53cb7b576bdb/volumes" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.134757 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.134845 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.158263 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.158333 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.176774 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.178584 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.187225 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.218100 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.841981 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.842051 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.842090 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 11:20:35 crc kubenswrapper[4990]: I1003 11:20:35.842108 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:36 crc kubenswrapper[4990]: I1003 11:20:36.871991 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:20:36 crc kubenswrapper[4990]: E1003 11:20:36.872621 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:20:37 crc kubenswrapper[4990]: I1003 11:20:37.791591 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:37 crc kubenswrapper[4990]: I1003 11:20:37.856700 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 11:20:37 crc kubenswrapper[4990]: I1003 11:20:37.909142 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 11:20:37 crc kubenswrapper[4990]: I1003 11:20:37.909427 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 11:20:37 crc kubenswrapper[4990]: I1003 11:20:37.909975 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 11:20:38 crc kubenswrapper[4990]: I1003 11:20:38.095588 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.741334 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kfp5z"] Oct 03 11:20:45 crc kubenswrapper[4990]: E1003 11:20:45.744312 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="dnsmasq-dns" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.744362 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="dnsmasq-dns" Oct 03 11:20:45 crc kubenswrapper[4990]: E1003 11:20:45.744390 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="init" Oct 03 11:20:45 crc 
kubenswrapper[4990]: I1003 11:20:45.744398 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="init" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.746482 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="670a7ccb-4dec-44e3-aaea-53cb7b576bdb" containerName="dnsmasq-dns" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.762420 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.777498 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kfp5z"] Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.889753 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6lt\" (UniqueName: \"kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt\") pod \"placement-db-create-kfp5z\" (UID: \"f624aa60-521f-442b-a15d-f7eb21df31e7\") " pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:45 crc kubenswrapper[4990]: I1003 11:20:45.992020 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh6lt\" (UniqueName: \"kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt\") pod \"placement-db-create-kfp5z\" (UID: \"f624aa60-521f-442b-a15d-f7eb21df31e7\") " pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.013934 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6lt\" (UniqueName: \"kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt\") pod \"placement-db-create-kfp5z\" (UID: \"f624aa60-521f-442b-a15d-f7eb21df31e7\") " pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.093092 4990 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.573277 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kfp5z"] Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.936656 4990 generic.go:334] "Generic (PLEG): container finished" podID="f624aa60-521f-442b-a15d-f7eb21df31e7" containerID="2c231615a596a195069ee92680a0d4521e42d4e4a241234cf57b0ccdfe878fcc" exitCode=0 Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.937260 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfp5z" event={"ID":"f624aa60-521f-442b-a15d-f7eb21df31e7","Type":"ContainerDied","Data":"2c231615a596a195069ee92680a0d4521e42d4e4a241234cf57b0ccdfe878fcc"} Oct 03 11:20:46 crc kubenswrapper[4990]: I1003 11:20:46.937284 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfp5z" event={"ID":"f624aa60-521f-442b-a15d-f7eb21df31e7","Type":"ContainerStarted","Data":"7ebf59d98db899347d1890faaeb5edb2f02ab8d12524f910c1a6bf2299bc792a"} Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.304564 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.438402 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh6lt\" (UniqueName: \"kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt\") pod \"f624aa60-521f-442b-a15d-f7eb21df31e7\" (UID: \"f624aa60-521f-442b-a15d-f7eb21df31e7\") " Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.444841 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt" (OuterVolumeSpecName: "kube-api-access-lh6lt") pod "f624aa60-521f-442b-a15d-f7eb21df31e7" (UID: "f624aa60-521f-442b-a15d-f7eb21df31e7"). InnerVolumeSpecName "kube-api-access-lh6lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.541472 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh6lt\" (UniqueName: \"kubernetes.io/projected/f624aa60-521f-442b-a15d-f7eb21df31e7-kube-api-access-lh6lt\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.955961 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfp5z" event={"ID":"f624aa60-521f-442b-a15d-f7eb21df31e7","Type":"ContainerDied","Data":"7ebf59d98db899347d1890faaeb5edb2f02ab8d12524f910c1a6bf2299bc792a"} Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.956009 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebf59d98db899347d1890faaeb5edb2f02ab8d12524f910c1a6bf2299bc792a" Oct 03 11:20:48 crc kubenswrapper[4990]: I1003 11:20:48.956018 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kfp5z" Oct 03 11:20:49 crc kubenswrapper[4990]: I1003 11:20:49.871975 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:20:49 crc kubenswrapper[4990]: E1003 11:20:49.872697 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.759660 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fe13-account-create-59jt8"] Oct 03 11:20:55 crc kubenswrapper[4990]: E1003 11:20:55.760764 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f624aa60-521f-442b-a15d-f7eb21df31e7" containerName="mariadb-database-create" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.760782 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f624aa60-521f-442b-a15d-f7eb21df31e7" containerName="mariadb-database-create" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.761052 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f624aa60-521f-442b-a15d-f7eb21df31e7" containerName="mariadb-database-create" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.761872 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.764454 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.798439 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fe13-account-create-59jt8"] Oct 03 11:20:55 crc kubenswrapper[4990]: I1003 11:20:55.905123 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkvm\" (UniqueName: \"kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm\") pod \"placement-fe13-account-create-59jt8\" (UID: \"00497aea-4a18-4550-aea4-94c333754563\") " pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:56 crc kubenswrapper[4990]: I1003 11:20:56.006611 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkvm\" (UniqueName: \"kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm\") pod \"placement-fe13-account-create-59jt8\" (UID: \"00497aea-4a18-4550-aea4-94c333754563\") " pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:56 crc kubenswrapper[4990]: I1003 11:20:56.026896 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkvm\" (UniqueName: \"kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm\") pod \"placement-fe13-account-create-59jt8\" (UID: \"00497aea-4a18-4550-aea4-94c333754563\") " pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:56 crc kubenswrapper[4990]: I1003 11:20:56.113341 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:56 crc kubenswrapper[4990]: I1003 11:20:56.376801 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fe13-account-create-59jt8"] Oct 03 11:20:56 crc kubenswrapper[4990]: W1003 11:20:56.385808 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00497aea_4a18_4550_aea4_94c333754563.slice/crio-a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643 WatchSource:0}: Error finding container a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643: Status 404 returned error can't find the container with id a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643 Oct 03 11:20:57 crc kubenswrapper[4990]: I1003 11:20:57.034344 4990 generic.go:334] "Generic (PLEG): container finished" podID="00497aea-4a18-4550-aea4-94c333754563" containerID="1c2ddb694a307fec69c562e63d4a90372dbd043d780e0efa2ea883d8d9052174" exitCode=0 Oct 03 11:20:57 crc kubenswrapper[4990]: I1003 11:20:57.034399 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fe13-account-create-59jt8" event={"ID":"00497aea-4a18-4550-aea4-94c333754563","Type":"ContainerDied","Data":"1c2ddb694a307fec69c562e63d4a90372dbd043d780e0efa2ea883d8d9052174"} Oct 03 11:20:57 crc kubenswrapper[4990]: I1003 11:20:57.034430 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fe13-account-create-59jt8" event={"ID":"00497aea-4a18-4550-aea4-94c333754563","Type":"ContainerStarted","Data":"a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643"} Oct 03 11:20:58 crc kubenswrapper[4990]: I1003 11:20:58.348686 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:20:58 crc kubenswrapper[4990]: I1003 11:20:58.453907 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwkvm\" (UniqueName: \"kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm\") pod \"00497aea-4a18-4550-aea4-94c333754563\" (UID: \"00497aea-4a18-4550-aea4-94c333754563\") " Oct 03 11:20:58 crc kubenswrapper[4990]: I1003 11:20:58.462608 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm" (OuterVolumeSpecName: "kube-api-access-xwkvm") pod "00497aea-4a18-4550-aea4-94c333754563" (UID: "00497aea-4a18-4550-aea4-94c333754563"). InnerVolumeSpecName "kube-api-access-xwkvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:20:58 crc kubenswrapper[4990]: I1003 11:20:58.555908 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwkvm\" (UniqueName: \"kubernetes.io/projected/00497aea-4a18-4550-aea4-94c333754563-kube-api-access-xwkvm\") on node \"crc\" DevicePath \"\"" Oct 03 11:20:59 crc kubenswrapper[4990]: I1003 11:20:59.058247 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fe13-account-create-59jt8" event={"ID":"00497aea-4a18-4550-aea4-94c333754563","Type":"ContainerDied","Data":"a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643"} Oct 03 11:20:59 crc kubenswrapper[4990]: I1003 11:20:59.058287 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dc93ec3341d9a59da0357ca01e9a7adfe9ee56ac344d1db3511ff4020ca643" Oct 03 11:20:59 crc kubenswrapper[4990]: I1003 11:20:59.058348 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fe13-account-create-59jt8" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.006995 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"] Oct 03 11:21:01 crc kubenswrapper[4990]: E1003 11:21:01.007755 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00497aea-4a18-4550-aea4-94c333754563" containerName="mariadb-account-create" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.007772 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="00497aea-4a18-4550-aea4-94c333754563" containerName="mariadb-account-create" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.007978 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="00497aea-4a18-4550-aea4-94c333754563" containerName="mariadb-account-create" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.009055 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.018740 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"] Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.056121 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mdqzf"] Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.057424 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.062979 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.063025 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.063126 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wns6z" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.069816 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mdqzf"] Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.102568 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.102732 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.102772 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.102907 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.103088 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fkv\" (UniqueName: \"kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205365 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205450 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205601 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205673 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205759 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205833 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcmsl\" (UniqueName: \"kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.205978 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fkv\" (UniqueName: \"kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.206102 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.206142 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.207037 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.207095 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.207095 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.207130 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.223225 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fkv\" (UniqueName: \"kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv\") pod \"dnsmasq-dns-b665ddf7-pgx56\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") " pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.310918 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmsl\" (UniqueName: \"kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.310985 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.311048 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.311107 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle\") pod 
\"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.311155 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.311898 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.315726 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.316054 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.318169 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.328926 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmsl\" (UniqueName: \"kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl\") pod \"placement-db-sync-mdqzf\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.339095 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.389015 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.868811 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"] Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.871682 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:21:01 crc kubenswrapper[4990]: E1003 11:21:01.871931 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:21:01 crc kubenswrapper[4990]: I1003 11:21:01.972332 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mdqzf"] Oct 03 11:21:02 crc kubenswrapper[4990]: I1003 11:21:02.090562 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerStarted","Data":"58ab21ca6e002bbc8bc0418152fe826d5f6645e778d95b00151af77c37d63aa3"} Oct 03 11:21:02 
crc kubenswrapper[4990]: I1003 11:21:02.090909 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerStarted","Data":"194dd50dc27e9bee9b38d644372246d1e37fbed958fa48cebd779e2e952de013"} Oct 03 11:21:02 crc kubenswrapper[4990]: I1003 11:21:02.093257 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdqzf" event={"ID":"e7ec54c8-f8df-4d83-921b-92df7a464ab0","Type":"ContainerStarted","Data":"84bcc6c91fc64c81df197e19fb00bfd550f8bb0f7882a153d227ade61925b9f3"} Oct 03 11:21:03 crc kubenswrapper[4990]: I1003 11:21:03.103418 4990 generic.go:334] "Generic (PLEG): container finished" podID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerID="58ab21ca6e002bbc8bc0418152fe826d5f6645e778d95b00151af77c37d63aa3" exitCode=0 Oct 03 11:21:03 crc kubenswrapper[4990]: I1003 11:21:03.103500 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerDied","Data":"58ab21ca6e002bbc8bc0418152fe826d5f6645e778d95b00151af77c37d63aa3"} Oct 03 11:21:03 crc kubenswrapper[4990]: I1003 11:21:03.106574 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdqzf" event={"ID":"e7ec54c8-f8df-4d83-921b-92df7a464ab0","Type":"ContainerStarted","Data":"bb0224101efd5cd61ffb912e6a4d75ff6ae4bff946a7391a2939c8cda43f41c3"} Oct 03 11:21:03 crc kubenswrapper[4990]: I1003 11:21:03.170896 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mdqzf" podStartSLOduration=2.170861184 podStartE2EDuration="2.170861184s" podCreationTimestamp="2025-10-03 11:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:21:03.159797768 +0000 UTC m=+5844.956429645" watchObservedRunningTime="2025-10-03 
11:21:03.170861184 +0000 UTC m=+5844.967493041" Oct 03 11:21:04 crc kubenswrapper[4990]: I1003 11:21:04.118093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerStarted","Data":"66fa367ce4241e874fc3d82bb892eb52cb9b9d7ee0de59fa2b264d09a3b1f742"} Oct 03 11:21:04 crc kubenswrapper[4990]: I1003 11:21:04.118502 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:04 crc kubenswrapper[4990]: I1003 11:21:04.120091 4990 generic.go:334] "Generic (PLEG): container finished" podID="e7ec54c8-f8df-4d83-921b-92df7a464ab0" containerID="bb0224101efd5cd61ffb912e6a4d75ff6ae4bff946a7391a2939c8cda43f41c3" exitCode=0 Oct 03 11:21:04 crc kubenswrapper[4990]: I1003 11:21:04.120133 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdqzf" event={"ID":"e7ec54c8-f8df-4d83-921b-92df7a464ab0","Type":"ContainerDied","Data":"bb0224101efd5cd61ffb912e6a4d75ff6ae4bff946a7391a2939c8cda43f41c3"} Oct 03 11:21:04 crc kubenswrapper[4990]: I1003 11:21:04.139342 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" podStartSLOduration=4.139321241 podStartE2EDuration="4.139321241s" podCreationTimestamp="2025-10-03 11:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:21:04.135471632 +0000 UTC m=+5845.932103489" watchObservedRunningTime="2025-10-03 11:21:04.139321241 +0000 UTC m=+5845.935953098" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.531374 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.700147 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcmsl\" (UniqueName: \"kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl\") pod \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.700597 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle\") pod \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.700722 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts\") pod \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.700855 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs\") pod \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.701603 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data\") pod \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\" (UID: \"e7ec54c8-f8df-4d83-921b-92df7a464ab0\") " Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.701132 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs" (OuterVolumeSpecName: "logs") pod "e7ec54c8-f8df-4d83-921b-92df7a464ab0" (UID: "e7ec54c8-f8df-4d83-921b-92df7a464ab0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.702419 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7ec54c8-f8df-4d83-921b-92df7a464ab0-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.708971 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl" (OuterVolumeSpecName: "kube-api-access-fcmsl") pod "e7ec54c8-f8df-4d83-921b-92df7a464ab0" (UID: "e7ec54c8-f8df-4d83-921b-92df7a464ab0"). InnerVolumeSpecName "kube-api-access-fcmsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.709093 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts" (OuterVolumeSpecName: "scripts") pod "e7ec54c8-f8df-4d83-921b-92df7a464ab0" (UID: "e7ec54c8-f8df-4d83-921b-92df7a464ab0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.733609 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data" (OuterVolumeSpecName: "config-data") pod "e7ec54c8-f8df-4d83-921b-92df7a464ab0" (UID: "e7ec54c8-f8df-4d83-921b-92df7a464ab0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.756364 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7ec54c8-f8df-4d83-921b-92df7a464ab0" (UID: "e7ec54c8-f8df-4d83-921b-92df7a464ab0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.804657 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.804721 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcmsl\" (UniqueName: \"kubernetes.io/projected/e7ec54c8-f8df-4d83-921b-92df7a464ab0-kube-api-access-fcmsl\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.804749 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:05 crc kubenswrapper[4990]: I1003 11:21:05.804774 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7ec54c8-f8df-4d83-921b-92df7a464ab0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.140769 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mdqzf" event={"ID":"e7ec54c8-f8df-4d83-921b-92df7a464ab0","Type":"ContainerDied","Data":"84bcc6c91fc64c81df197e19fb00bfd550f8bb0f7882a153d227ade61925b9f3"} Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.141100 4990 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="84bcc6c91fc64c81df197e19fb00bfd550f8bb0f7882a153d227ade61925b9f3" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.140882 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mdqzf" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.627414 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-575f56f894-g5gdh"] Oct 03 11:21:06 crc kubenswrapper[4990]: E1003 11:21:06.627965 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ec54c8-f8df-4d83-921b-92df7a464ab0" containerName="placement-db-sync" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.627982 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ec54c8-f8df-4d83-921b-92df7a464ab0" containerName="placement-db-sync" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.628231 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ec54c8-f8df-4d83-921b-92df7a464ab0" containerName="placement-db-sync" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.629405 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.632249 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.633082 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.633308 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.634140 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.634297 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wns6z" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.647410 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575f56f894-g5gdh"] Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821006 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-scripts\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821074 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-combined-ca-bundle\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821148 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-logs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821173 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n5z\" (UniqueName: \"kubernetes.io/projected/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-kube-api-access-v7n5z\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821218 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-config-data\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821236 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-public-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.821253 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-internal-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923129 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-logs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923190 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n5z\" (UniqueName: \"kubernetes.io/projected/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-kube-api-access-v7n5z\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923256 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-config-data\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923490 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-public-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923656 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-logs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.923985 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-internal-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.924025 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-scripts\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.924090 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-combined-ca-bundle\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.927580 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-scripts\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.928315 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-internal-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.930395 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-config-data\") pod \"placement-575f56f894-g5gdh\" (UID: 
\"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.932029 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-public-tls-certs\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.935680 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-combined-ca-bundle\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.940446 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n5z\" (UniqueName: \"kubernetes.io/projected/fce8c270-053f-4b51-ab1d-6f92dabb2ed2-kube-api-access-v7n5z\") pod \"placement-575f56f894-g5gdh\" (UID: \"fce8c270-053f-4b51-ab1d-6f92dabb2ed2\") " pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:06 crc kubenswrapper[4990]: I1003 11:21:06.959979 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:07 crc kubenswrapper[4990]: I1003 11:21:07.467094 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575f56f894-g5gdh"] Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.168265 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575f56f894-g5gdh" event={"ID":"fce8c270-053f-4b51-ab1d-6f92dabb2ed2","Type":"ContainerStarted","Data":"01507b43c2d861e55824dafd31b8cd8e58ebc4e0549e442f8b4d2b48d8cc4cfc"} Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.168689 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.168706 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575f56f894-g5gdh" event={"ID":"fce8c270-053f-4b51-ab1d-6f92dabb2ed2","Type":"ContainerStarted","Data":"48b9f546b28d47403dc5db94b4f4feb8dc50f99d676159c79dc9d89a4beeea17"} Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.168718 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575f56f894-g5gdh" event={"ID":"fce8c270-053f-4b51-ab1d-6f92dabb2ed2","Type":"ContainerStarted","Data":"ee4ca91f0d7285724bb2ecf0928b7ba6f8cc6b53c542dcc3b3bba08fb951827b"} Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.168732 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:08 crc kubenswrapper[4990]: I1003 11:21:08.193603 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-575f56f894-g5gdh" podStartSLOduration=2.193564762 podStartE2EDuration="2.193564762s" podCreationTimestamp="2025-10-03 11:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:21:08.19001892 +0000 UTC 
m=+5849.986650797" watchObservedRunningTime="2025-10-03 11:21:08.193564762 +0000 UTC m=+5849.990196629" Oct 03 11:21:11 crc kubenswrapper[4990]: I1003 11:21:11.340756 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" Oct 03 11:21:11 crc kubenswrapper[4990]: I1003 11:21:11.420577 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:21:11 crc kubenswrapper[4990]: I1003 11:21:11.420906 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bb849867-hs62h" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="dnsmasq-dns" containerID="cri-o://e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244" gracePeriod=10 Oct 03 11:21:11 crc kubenswrapper[4990]: I1003 11:21:11.915756 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.040456 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc\") pod \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.040604 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb\") pod \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.040711 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb\") pod \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\" (UID: 
\"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.040746 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config\") pod \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.041448 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjd49\" (UniqueName: \"kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49\") pod \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\" (UID: \"1f206a0b-569e-4d77-ac0b-f7a41672b39c\") " Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.064733 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49" (OuterVolumeSpecName: "kube-api-access-gjd49") pod "1f206a0b-569e-4d77-ac0b-f7a41672b39c" (UID: "1f206a0b-569e-4d77-ac0b-f7a41672b39c"). InnerVolumeSpecName "kube-api-access-gjd49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.126391 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f206a0b-569e-4d77-ac0b-f7a41672b39c" (UID: "1f206a0b-569e-4d77-ac0b-f7a41672b39c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.131028 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f206a0b-569e-4d77-ac0b-f7a41672b39c" (UID: "1f206a0b-569e-4d77-ac0b-f7a41672b39c"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.143420 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.143448 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.143458 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjd49\" (UniqueName: \"kubernetes.io/projected/1f206a0b-569e-4d77-ac0b-f7a41672b39c-kube-api-access-gjd49\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.156876 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f206a0b-569e-4d77-ac0b-f7a41672b39c" (UID: "1f206a0b-569e-4d77-ac0b-f7a41672b39c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.169291 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config" (OuterVolumeSpecName: "config") pod "1f206a0b-569e-4d77-ac0b-f7a41672b39c" (UID: "1f206a0b-569e-4d77-ac0b-f7a41672b39c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.203907 4990 generic.go:334] "Generic (PLEG): container finished" podID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerID="e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244" exitCode=0 Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.203975 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb849867-hs62h" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.204015 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb849867-hs62h" event={"ID":"1f206a0b-569e-4d77-ac0b-f7a41672b39c","Type":"ContainerDied","Data":"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244"} Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.204303 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb849867-hs62h" event={"ID":"1f206a0b-569e-4d77-ac0b-f7a41672b39c","Type":"ContainerDied","Data":"dc7f405fa2b173085c59589ee19a97c6ea0b45ca47d5072f3aea4e4d57f8b1ae"} Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.204324 4990 scope.go:117] "RemoveContainer" containerID="e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.228202 4990 scope.go:117] "RemoveContainer" containerID="d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.234552 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.241637 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bb849867-hs62h"] Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.245058 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.245092 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f206a0b-569e-4d77-ac0b-f7a41672b39c-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.254613 4990 scope.go:117] "RemoveContainer" containerID="e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244" Oct 03 11:21:12 crc kubenswrapper[4990]: E1003 11:21:12.255093 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244\": container with ID starting with e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244 not found: ID does not exist" containerID="e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.255132 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244"} err="failed to get container status \"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244\": rpc error: code = NotFound desc = could not find container \"e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244\": container with ID starting with e2f9431892b1ee1101802f2264c934f257ba68dcc1ff162e87f3e42c75b6c244 not found: ID does not exist" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.255152 4990 scope.go:117] "RemoveContainer" containerID="d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef" Oct 03 11:21:12 crc kubenswrapper[4990]: E1003 11:21:12.255369 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef\": container with ID starting with d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef not found: ID does not exist" containerID="d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.255395 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef"} err="failed to get container status \"d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef\": rpc error: code = NotFound desc = could not find container \"d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef\": container with ID starting with d968bc16c382f231795009299041ead708cbf8de8aeaec1f22ff70a00ec726ef not found: ID does not exist" Oct 03 11:21:12 crc kubenswrapper[4990]: I1003 11:21:12.884913 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" path="/var/lib/kubelet/pods/1f206a0b-569e-4d77-ac0b-f7a41672b39c/volumes" Oct 03 11:21:16 crc kubenswrapper[4990]: I1003 11:21:16.872970 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:21:16 crc kubenswrapper[4990]: E1003 11:21:16.873692 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:21:29 crc kubenswrapper[4990]: I1003 11:21:29.871917 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:21:29 crc 
kubenswrapper[4990]: E1003 11:21:29.872760 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:21:37 crc kubenswrapper[4990]: I1003 11:21:37.975055 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:38 crc kubenswrapper[4990]: I1003 11:21:38.026854 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575f56f894-g5gdh" Oct 03 11:21:44 crc kubenswrapper[4990]: I1003 11:21:44.872912 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:21:44 crc kubenswrapper[4990]: E1003 11:21:44.874034 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:21:48 crc kubenswrapper[4990]: I1003 11:21:48.948395 4990 scope.go:117] "RemoveContainer" containerID="546d73db120b08f4ba62e9d0ceb77f013d502b1e81001f8701db0d7f74aaa7dd" Oct 03 11:21:49 crc kubenswrapper[4990]: I1003 11:21:49.009099 4990 scope.go:117] "RemoveContainer" containerID="62be4a19c4dc7cb13d064b89cb5a7b0d7aca046c5c72b58d721cf64d9f3a4473" Oct 03 11:21:49 crc kubenswrapper[4990]: I1003 11:21:49.072771 4990 scope.go:117] "RemoveContainer" 
containerID="ce9973279e392bf1d3264abf9bd2d1823d104168f3143812f537fd24b70ee2e0" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.681651 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7m2pb"] Oct 03 11:21:57 crc kubenswrapper[4990]: E1003 11:21:57.682728 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="dnsmasq-dns" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.682744 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="dnsmasq-dns" Oct 03 11:21:57 crc kubenswrapper[4990]: E1003 11:21:57.682769 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="init" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.682781 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="init" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.683012 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f206a0b-569e-4d77-ac0b-f7a41672b39c" containerName="dnsmasq-dns" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.683807 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.688776 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7m2pb"] Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.734211 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmzv7\" (UniqueName: \"kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7\") pod \"nova-api-db-create-7m2pb\" (UID: \"dff835e0-d422-414c-a731-49acae82b765\") " pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.750193 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jwwc2"] Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.752125 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.757404 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jwwc2"] Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.836650 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmzv7\" (UniqueName: \"kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7\") pod \"nova-api-db-create-7m2pb\" (UID: \"dff835e0-d422-414c-a731-49acae82b765\") " pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.836719 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9nf\" (UniqueName: \"kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf\") pod \"nova-cell0-db-create-jwwc2\" (UID: \"d748b646-6b70-42c8-8bc5-f8ef381f4eab\") " pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:21:57 crc kubenswrapper[4990]: 
I1003 11:21:57.853746 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmzv7\" (UniqueName: \"kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7\") pod \"nova-api-db-create-7m2pb\" (UID: \"dff835e0-d422-414c-a731-49acae82b765\") " pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.938587 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9nf\" (UniqueName: \"kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf\") pod \"nova-cell0-db-create-jwwc2\" (UID: \"d748b646-6b70-42c8-8bc5-f8ef381f4eab\") " pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.946619 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cd6sv"] Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.947978 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.962609 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9nf\" (UniqueName: \"kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf\") pod \"nova-cell0-db-create-jwwc2\" (UID: \"d748b646-6b70-42c8-8bc5-f8ef381f4eab\") " pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:21:57 crc kubenswrapper[4990]: I1003 11:21:57.968965 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cd6sv"] Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.029300 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.068539 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.142559 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8k64\" (UniqueName: \"kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64\") pod \"nova-cell1-db-create-cd6sv\" (UID: \"c94f2641-c80a-4273-9b61-2954040a76fb\") " pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.244028 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8k64\" (UniqueName: \"kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64\") pod \"nova-cell1-db-create-cd6sv\" (UID: \"c94f2641-c80a-4273-9b61-2954040a76fb\") " pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.265155 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8k64\" (UniqueName: \"kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64\") pod \"nova-cell1-db-create-cd6sv\" (UID: \"c94f2641-c80a-4273-9b61-2954040a76fb\") " pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.297870 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.508306 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7m2pb"] Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.590414 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jwwc2"] Oct 03 11:21:58 crc kubenswrapper[4990]: W1003 11:21:58.595057 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd748b646_6b70_42c8_8bc5_f8ef381f4eab.slice/crio-300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26 WatchSource:0}: Error finding container 300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26: Status 404 returned error can't find the container with id 300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26 Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.653256 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m2pb" event={"ID":"dff835e0-d422-414c-a731-49acae82b765","Type":"ContainerStarted","Data":"a62790ff5216163704e36c8d6162f06e0e170189d14ee9fa88251970299b5cb0"} Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.660798 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwwc2" event={"ID":"d748b646-6b70-42c8-8bc5-f8ef381f4eab","Type":"ContainerStarted","Data":"300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26"} Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.739607 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cd6sv"] Oct 03 11:21:58 crc kubenswrapper[4990]: W1003 11:21:58.823721 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc94f2641_c80a_4273_9b61_2954040a76fb.slice/crio-3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a WatchSource:0}: Error finding container 3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a: Status 404 returned error can't find the container with id 3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a Oct 03 11:21:58 crc kubenswrapper[4990]: I1003 11:21:58.885078 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:21:58 crc kubenswrapper[4990]: E1003 11:21:58.885480 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.674023 4990 generic.go:334] "Generic (PLEG): container finished" podID="dff835e0-d422-414c-a731-49acae82b765" containerID="318bc1cd1da7d59d2a0ce59a38d11f5b793b14747a103bd82edd8599fdba3b55" exitCode=0 Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.674156 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m2pb" event={"ID":"dff835e0-d422-414c-a731-49acae82b765","Type":"ContainerDied","Data":"318bc1cd1da7d59d2a0ce59a38d11f5b793b14747a103bd82edd8599fdba3b55"} Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.676838 4990 generic.go:334] "Generic (PLEG): container finished" podID="d748b646-6b70-42c8-8bc5-f8ef381f4eab" containerID="119a8884ce5af0cffd38ae3b5fb2fb18b1887228525d4979be825480f86d614a" exitCode=0 Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.676942 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwwc2" event={"ID":"d748b646-6b70-42c8-8bc5-f8ef381f4eab","Type":"ContainerDied","Data":"119a8884ce5af0cffd38ae3b5fb2fb18b1887228525d4979be825480f86d614a"} Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.679177 4990 generic.go:334] "Generic (PLEG): container finished" podID="c94f2641-c80a-4273-9b61-2954040a76fb" containerID="8a5f22c8c6643b4bb53283d20653ee9ccf9d6937ca3d23dac2430e1f1aa05997" exitCode=0 Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.679238 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cd6sv" event={"ID":"c94f2641-c80a-4273-9b61-2954040a76fb","Type":"ContainerDied","Data":"8a5f22c8c6643b4bb53283d20653ee9ccf9d6937ca3d23dac2430e1f1aa05997"} Oct 03 11:21:59 crc kubenswrapper[4990]: I1003 11:21:59.679274 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cd6sv" event={"ID":"c94f2641-c80a-4273-9b61-2954040a76fb","Type":"ContainerStarted","Data":"3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a"} Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.207270 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.212013 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.217232 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.303348 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8k64\" (UniqueName: \"kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64\") pod \"c94f2641-c80a-4273-9b61-2954040a76fb\" (UID: \"c94f2641-c80a-4273-9b61-2954040a76fb\") " Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.303680 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmzv7\" (UniqueName: \"kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7\") pod \"dff835e0-d422-414c-a731-49acae82b765\" (UID: \"dff835e0-d422-414c-a731-49acae82b765\") " Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.303745 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9nf\" (UniqueName: \"kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf\") pod \"d748b646-6b70-42c8-8bc5-f8ef381f4eab\" (UID: \"d748b646-6b70-42c8-8bc5-f8ef381f4eab\") " Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.310997 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7" (OuterVolumeSpecName: "kube-api-access-pmzv7") pod "dff835e0-d422-414c-a731-49acae82b765" (UID: "dff835e0-d422-414c-a731-49acae82b765"). InnerVolumeSpecName "kube-api-access-pmzv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.312355 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf" (OuterVolumeSpecName: "kube-api-access-sp9nf") pod "d748b646-6b70-42c8-8bc5-f8ef381f4eab" (UID: "d748b646-6b70-42c8-8bc5-f8ef381f4eab"). 
InnerVolumeSpecName "kube-api-access-sp9nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.315590 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64" (OuterVolumeSpecName: "kube-api-access-h8k64") pod "c94f2641-c80a-4273-9b61-2954040a76fb" (UID: "c94f2641-c80a-4273-9b61-2954040a76fb"). InnerVolumeSpecName "kube-api-access-h8k64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.406542 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8k64\" (UniqueName: \"kubernetes.io/projected/c94f2641-c80a-4273-9b61-2954040a76fb-kube-api-access-h8k64\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.406570 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmzv7\" (UniqueName: \"kubernetes.io/projected/dff835e0-d422-414c-a731-49acae82b765-kube-api-access-pmzv7\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.406579 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9nf\" (UniqueName: \"kubernetes.io/projected/d748b646-6b70-42c8-8bc5-f8ef381f4eab-kube-api-access-sp9nf\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.724139 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwwc2" event={"ID":"d748b646-6b70-42c8-8bc5-f8ef381f4eab","Type":"ContainerDied","Data":"300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26"} Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.724182 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300f853e39f446795e14c3b9eeab6b09e6200e5300709d3e2469dd5ec94a7c26" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 
11:22:01.724204 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jwwc2" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.727001 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cd6sv" event={"ID":"c94f2641-c80a-4273-9b61-2954040a76fb","Type":"ContainerDied","Data":"3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a"} Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.727022 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cd6sv" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.727029 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a7372df40f2a253cb88a2c38aa41d75a2444cb46df11fb53d6fd98b0180435a" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.729577 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m2pb" event={"ID":"dff835e0-d422-414c-a731-49acae82b765","Type":"ContainerDied","Data":"a62790ff5216163704e36c8d6162f06e0e170189d14ee9fa88251970299b5cb0"} Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.729635 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62790ff5216163704e36c8d6162f06e0e170189d14ee9fa88251970299b5cb0" Oct 03 11:22:01 crc kubenswrapper[4990]: I1003 11:22:01.729606 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7m2pb" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.892129 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2683-account-create-2s8dt"] Oct 03 11:22:07 crc kubenswrapper[4990]: E1003 11:22:07.893347 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff835e0-d422-414c-a731-49acae82b765" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893364 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff835e0-d422-414c-a731-49acae82b765" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: E1003 11:22:07.893393 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94f2641-c80a-4273-9b61-2954040a76fb" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893402 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94f2641-c80a-4273-9b61-2954040a76fb" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: E1003 11:22:07.893424 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d748b646-6b70-42c8-8bc5-f8ef381f4eab" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893432 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d748b646-6b70-42c8-8bc5-f8ef381f4eab" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893689 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94f2641-c80a-4273-9b61-2954040a76fb" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893721 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d748b646-6b70-42c8-8bc5-f8ef381f4eab" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.893740 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dff835e0-d422-414c-a731-49acae82b765" containerName="mariadb-database-create" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.894447 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.896368 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 11:22:07 crc kubenswrapper[4990]: I1003 11:22:07.902395 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2683-account-create-2s8dt"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.037504 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbf8x\" (UniqueName: \"kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x\") pod \"nova-api-2683-account-create-2s8dt\" (UID: \"93f7e561-6095-4bae-96dc-b5d4fc2db2ed\") " pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.086263 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-05a9-account-create-2sp9s"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.087458 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.089805 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.105806 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05a9-account-create-2sp9s"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.140317 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbf8x\" (UniqueName: \"kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x\") pod \"nova-api-2683-account-create-2s8dt\" (UID: \"93f7e561-6095-4bae-96dc-b5d4fc2db2ed\") " pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.161959 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbf8x\" (UniqueName: \"kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x\") pod \"nova-api-2683-account-create-2s8dt\" (UID: \"93f7e561-6095-4bae-96dc-b5d4fc2db2ed\") " pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.222938 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.242600 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw\") pod \"nova-cell0-05a9-account-create-2sp9s\" (UID: \"4d122437-a3dd-491f-98b1-0487982c980c\") " pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.304891 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e71c-account-create-s96zj"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.306679 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.311420 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.323368 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e71c-account-create-s96zj"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.345396 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw\") pod \"nova-cell0-05a9-account-create-2sp9s\" (UID: \"4d122437-a3dd-491f-98b1-0487982c980c\") " pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.376918 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw\") pod \"nova-cell0-05a9-account-create-2sp9s\" (UID: \"4d122437-a3dd-491f-98b1-0487982c980c\") " 
pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.426792 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.447084 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25mrs\" (UniqueName: \"kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs\") pod \"nova-cell1-e71c-account-create-s96zj\" (UID: \"51ede1ac-72d9-4a04-b026-e206a826e575\") " pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.548917 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25mrs\" (UniqueName: \"kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs\") pod \"nova-cell1-e71c-account-create-s96zj\" (UID: \"51ede1ac-72d9-4a04-b026-e206a826e575\") " pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.565925 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25mrs\" (UniqueName: \"kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs\") pod \"nova-cell1-e71c-account-create-s96zj\" (UID: \"51ede1ac-72d9-4a04-b026-e206a826e575\") " pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.747239 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.759936 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2683-account-create-2s8dt"] Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.798298 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2683-account-create-2s8dt" event={"ID":"93f7e561-6095-4bae-96dc-b5d4fc2db2ed","Type":"ContainerStarted","Data":"98bf0d8d91372fc36f08f9b3b72fab81c2b586aeafa444d099a55f38b8db9f0e"} Oct 03 11:22:08 crc kubenswrapper[4990]: I1003 11:22:08.867391 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-05a9-account-create-2sp9s"] Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.274896 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e71c-account-create-s96zj"] Oct 03 11:22:09 crc kubenswrapper[4990]: W1003 11:22:09.307165 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ede1ac_72d9_4a04_b026_e206a826e575.slice/crio-20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e WatchSource:0}: Error finding container 20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e: Status 404 returned error can't find the container with id 20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.807934 4990 generic.go:334] "Generic (PLEG): container finished" podID="93f7e561-6095-4bae-96dc-b5d4fc2db2ed" containerID="a806383d40edd08d38ac187c96d26dabf2df2239893c2bb6f3f5fb4b05ff6592" exitCode=0 Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.808054 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2683-account-create-2s8dt" 
event={"ID":"93f7e561-6095-4bae-96dc-b5d4fc2db2ed","Type":"ContainerDied","Data":"a806383d40edd08d38ac187c96d26dabf2df2239893c2bb6f3f5fb4b05ff6592"} Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.810007 4990 generic.go:334] "Generic (PLEG): container finished" podID="51ede1ac-72d9-4a04-b026-e206a826e575" containerID="deea82c013222624c66563b9ac58c134355070e19936c3007d70aad92ce980f8" exitCode=0 Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.810107 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e71c-account-create-s96zj" event={"ID":"51ede1ac-72d9-4a04-b026-e206a826e575","Type":"ContainerDied","Data":"deea82c013222624c66563b9ac58c134355070e19936c3007d70aad92ce980f8"} Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.810133 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e71c-account-create-s96zj" event={"ID":"51ede1ac-72d9-4a04-b026-e206a826e575","Type":"ContainerStarted","Data":"20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e"} Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.811801 4990 generic.go:334] "Generic (PLEG): container finished" podID="4d122437-a3dd-491f-98b1-0487982c980c" containerID="9848e116d9e47e5d81a6ab1ffee80ba68d47bf6bfa13e1c0f5c9f038e5923d88" exitCode=0 Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.811836 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05a9-account-create-2sp9s" event={"ID":"4d122437-a3dd-491f-98b1-0487982c980c","Type":"ContainerDied","Data":"9848e116d9e47e5d81a6ab1ffee80ba68d47bf6bfa13e1c0f5c9f038e5923d88"} Oct 03 11:22:09 crc kubenswrapper[4990]: I1003 11:22:09.811856 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05a9-account-create-2sp9s" event={"ID":"4d122437-a3dd-491f-98b1-0487982c980c","Type":"ContainerStarted","Data":"ac5a2993e8f234784f173577ce110ab09f4e32894f47c8d3f54bce55296dd3e1"} Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 
11:22:11.243074 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.250291 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.273185 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.412955 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbf8x\" (UniqueName: \"kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x\") pod \"93f7e561-6095-4bae-96dc-b5d4fc2db2ed\" (UID: \"93f7e561-6095-4bae-96dc-b5d4fc2db2ed\") " Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.413318 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25mrs\" (UniqueName: \"kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs\") pod \"51ede1ac-72d9-4a04-b026-e206a826e575\" (UID: \"51ede1ac-72d9-4a04-b026-e206a826e575\") " Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.413444 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw\") pod \"4d122437-a3dd-491f-98b1-0487982c980c\" (UID: \"4d122437-a3dd-491f-98b1-0487982c980c\") " Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.419121 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x" (OuterVolumeSpecName: "kube-api-access-kbf8x") pod "93f7e561-6095-4bae-96dc-b5d4fc2db2ed" (UID: 
"93f7e561-6095-4bae-96dc-b5d4fc2db2ed"). InnerVolumeSpecName "kube-api-access-kbf8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.419596 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs" (OuterVolumeSpecName: "kube-api-access-25mrs") pod "51ede1ac-72d9-4a04-b026-e206a826e575" (UID: "51ede1ac-72d9-4a04-b026-e206a826e575"). InnerVolumeSpecName "kube-api-access-25mrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.437837 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw" (OuterVolumeSpecName: "kube-api-access-ppphw") pod "4d122437-a3dd-491f-98b1-0487982c980c" (UID: "4d122437-a3dd-491f-98b1-0487982c980c"). InnerVolumeSpecName "kube-api-access-ppphw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.516864 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbf8x\" (UniqueName: \"kubernetes.io/projected/93f7e561-6095-4bae-96dc-b5d4fc2db2ed-kube-api-access-kbf8x\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.516893 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25mrs\" (UniqueName: \"kubernetes.io/projected/51ede1ac-72d9-4a04-b026-e206a826e575-kube-api-access-25mrs\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.516918 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppphw\" (UniqueName: \"kubernetes.io/projected/4d122437-a3dd-491f-98b1-0487982c980c-kube-api-access-ppphw\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.831907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2683-account-create-2s8dt" event={"ID":"93f7e561-6095-4bae-96dc-b5d4fc2db2ed","Type":"ContainerDied","Data":"98bf0d8d91372fc36f08f9b3b72fab81c2b586aeafa444d099a55f38b8db9f0e"} Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.832195 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98bf0d8d91372fc36f08f9b3b72fab81c2b586aeafa444d099a55f38b8db9f0e" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.832000 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2683-account-create-2s8dt" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.834833 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e71c-account-create-s96zj" event={"ID":"51ede1ac-72d9-4a04-b026-e206a826e575","Type":"ContainerDied","Data":"20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e"} Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.835010 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20ef275c071ff22555d815fb8e96821471c2f589b99f03a4ffca7a3a2527cf5e" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.834859 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e71c-account-create-s96zj" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.837755 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-05a9-account-create-2sp9s" event={"ID":"4d122437-a3dd-491f-98b1-0487982c980c","Type":"ContainerDied","Data":"ac5a2993e8f234784f173577ce110ab09f4e32894f47c8d3f54bce55296dd3e1"} Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.837801 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac5a2993e8f234784f173577ce110ab09f4e32894f47c8d3f54bce55296dd3e1" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.837864 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-05a9-account-create-2sp9s" Oct 03 11:22:11 crc kubenswrapper[4990]: I1003 11:22:11.871217 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:22:11 crc kubenswrapper[4990]: E1003 11:22:11.871545 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.315723 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6nfpn"] Oct 03 11:22:13 crc kubenswrapper[4990]: E1003 11:22:13.316694 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f7e561-6095-4bae-96dc-b5d4fc2db2ed" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.316722 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f7e561-6095-4bae-96dc-b5d4fc2db2ed" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: E1003 11:22:13.316745 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d122437-a3dd-491f-98b1-0487982c980c" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.316753 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d122437-a3dd-491f-98b1-0487982c980c" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: E1003 11:22:13.316790 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ede1ac-72d9-4a04-b026-e206a826e575" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 
11:22:13.316799 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ede1ac-72d9-4a04-b026-e206a826e575" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.317014 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ede1ac-72d9-4a04-b026-e206a826e575" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.317030 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d122437-a3dd-491f-98b1-0487982c980c" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.317047 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f7e561-6095-4bae-96dc-b5d4fc2db2ed" containerName="mariadb-account-create" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.317765 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.323195 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.323836 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.324164 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ls2gg" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.330122 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6nfpn"] Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.453309 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kbfm\" (UniqueName: \"kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm\") pod \"nova-cell0-conductor-db-sync-6nfpn\" 
(UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.453653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.453749 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.453911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.555966 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.556085 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kbfm\" (UniqueName: 
\"kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.556142 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.556186 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.570636 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.570989 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.575661 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.586028 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kbfm\" (UniqueName: \"kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm\") pod \"nova-cell0-conductor-db-sync-6nfpn\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:13 crc kubenswrapper[4990]: I1003 11:22:13.641499 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:14 crc kubenswrapper[4990]: I1003 11:22:14.166021 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6nfpn"] Oct 03 11:22:14 crc kubenswrapper[4990]: W1003 11:22:14.180672 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdcadd49_4156_411d_9013_331a25abf52b.slice/crio-4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e WatchSource:0}: Error finding container 4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e: Status 404 returned error can't find the container with id 4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e Oct 03 11:22:14 crc kubenswrapper[4990]: I1003 11:22:14.863894 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" event={"ID":"cdcadd49-4156-411d-9013-331a25abf52b","Type":"ContainerStarted","Data":"2568fa02d03f14b960503812d7d2fb5ccea241948fbf916153b54587912a556f"} Oct 03 11:22:14 crc kubenswrapper[4990]: I1003 11:22:14.864229 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-6nfpn" event={"ID":"cdcadd49-4156-411d-9013-331a25abf52b","Type":"ContainerStarted","Data":"4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e"} Oct 03 11:22:14 crc kubenswrapper[4990]: I1003 11:22:14.885331 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" podStartSLOduration=1.8853167389999999 podStartE2EDuration="1.885316739s" podCreationTimestamp="2025-10-03 11:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:14.8837774 +0000 UTC m=+5916.680409267" watchObservedRunningTime="2025-10-03 11:22:14.885316739 +0000 UTC m=+5916.681948596" Oct 03 11:22:19 crc kubenswrapper[4990]: I1003 11:22:19.909234 4990 generic.go:334] "Generic (PLEG): container finished" podID="cdcadd49-4156-411d-9013-331a25abf52b" containerID="2568fa02d03f14b960503812d7d2fb5ccea241948fbf916153b54587912a556f" exitCode=0 Oct 03 11:22:19 crc kubenswrapper[4990]: I1003 11:22:19.909320 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" event={"ID":"cdcadd49-4156-411d-9013-331a25abf52b","Type":"ContainerDied","Data":"2568fa02d03f14b960503812d7d2fb5ccea241948fbf916153b54587912a556f"} Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.256260 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.399939 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle\") pod \"cdcadd49-4156-411d-9013-331a25abf52b\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.400378 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data\") pod \"cdcadd49-4156-411d-9013-331a25abf52b\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.400649 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts\") pod \"cdcadd49-4156-411d-9013-331a25abf52b\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.400731 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kbfm\" (UniqueName: \"kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm\") pod \"cdcadd49-4156-411d-9013-331a25abf52b\" (UID: \"cdcadd49-4156-411d-9013-331a25abf52b\") " Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.406479 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm" (OuterVolumeSpecName: "kube-api-access-6kbfm") pod "cdcadd49-4156-411d-9013-331a25abf52b" (UID: "cdcadd49-4156-411d-9013-331a25abf52b"). InnerVolumeSpecName "kube-api-access-6kbfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.407320 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts" (OuterVolumeSpecName: "scripts") pod "cdcadd49-4156-411d-9013-331a25abf52b" (UID: "cdcadd49-4156-411d-9013-331a25abf52b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.431925 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdcadd49-4156-411d-9013-331a25abf52b" (UID: "cdcadd49-4156-411d-9013-331a25abf52b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.433290 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data" (OuterVolumeSpecName: "config-data") pod "cdcadd49-4156-411d-9013-331a25abf52b" (UID: "cdcadd49-4156-411d-9013-331a25abf52b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.503714 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.503759 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.503776 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kbfm\" (UniqueName: \"kubernetes.io/projected/cdcadd49-4156-411d-9013-331a25abf52b-kube-api-access-6kbfm\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.503790 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdcadd49-4156-411d-9013-331a25abf52b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.941373 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" event={"ID":"cdcadd49-4156-411d-9013-331a25abf52b","Type":"ContainerDied","Data":"4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e"} Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.941411 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4feaf616ad94876d8bc59728663dbd82d80285a4fbec1e7a1ade36fe3bbea95e" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.941459 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6nfpn" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.999396 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 11:22:21 crc kubenswrapper[4990]: E1003 11:22:21.999849 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcadd49-4156-411d-9013-331a25abf52b" containerName="nova-cell0-conductor-db-sync" Oct 03 11:22:21 crc kubenswrapper[4990]: I1003 11:22:21.999873 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcadd49-4156-411d-9013-331a25abf52b" containerName="nova-cell0-conductor-db-sync" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.000134 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcadd49-4156-411d-9013-331a25abf52b" containerName="nova-cell0-conductor-db-sync" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.001061 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.005713 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ls2gg" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.005919 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.009563 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.114309 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: 
I1003 11:22:22.114412 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vr7f\" (UniqueName: \"kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.114439 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.216325 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vr7f\" (UniqueName: \"kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.216395 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.216554 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.224843 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.236264 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.250415 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vr7f\" (UniqueName: \"kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f\") pod \"nova-cell0-conductor-0\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.324582 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.778829 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 11:22:22 crc kubenswrapper[4990]: I1003 11:22:22.952742 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a33bb044-a557-41e0-93be-9160096406a3","Type":"ContainerStarted","Data":"805ca2768f75f6565093d75f167f6bae3cfb6528579fab31b3d4d2e2b2f4691d"} Oct 03 11:22:23 crc kubenswrapper[4990]: I1003 11:22:23.968978 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a33bb044-a557-41e0-93be-9160096406a3","Type":"ContainerStarted","Data":"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8"} Oct 03 11:22:23 crc kubenswrapper[4990]: I1003 11:22:23.969419 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:25 crc kubenswrapper[4990]: I1003 11:22:25.872198 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:22:25 crc kubenswrapper[4990]: E1003 11:22:25.872594 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.358470 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.381948 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-0" podStartSLOduration=6.381930525 podStartE2EDuration="6.381930525s" podCreationTimestamp="2025-10-03 11:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:23.995381713 +0000 UTC m=+5925.792013620" watchObservedRunningTime="2025-10-03 11:22:27.381930525 +0000 UTC m=+5929.178562382" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.862664 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2vjkx"] Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.864227 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.867663 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.869984 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.878025 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2vjkx"] Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.919211 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.919287 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5x4g\" (UniqueName: \"kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g\") pod 
\"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.919339 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:27 crc kubenswrapper[4990]: I1003 11:22:27.919363 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.021131 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.021187 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5x4g\" (UniqueName: \"kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.021236 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts\") pod 
\"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.021256 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.023422 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.025112 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.028737 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.033265 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.039257 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.039292 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts\") pod 
\"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.061998 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.067031 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5x4g\" (UniqueName: \"kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g\") pod \"nova-cell0-cell-mapping-2vjkx\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.076806 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.078397 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.084943 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.099225 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.131666 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhgt\" (UniqueName: \"kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.131754 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data\") pod \"nova-api-0\" (UID: 
\"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.131826 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.131857 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.190667 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.226056 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.260474 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.263247 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.263321 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277244 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277311 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsgj\" (UniqueName: \"kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277311 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277439 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277486 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277620 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.277714 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhgt\" (UniqueName: \"kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.278026 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.282623 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " 
pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.282783 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.301886 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.314451 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.316464 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhgt\" (UniqueName: \"kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt\") pod \"nova-api-0\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.321093 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.344676 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.345920 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.352840 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.364395 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383163 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsgj\" (UniqueName: \"kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383251 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383286 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383304 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383325 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383348 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt59m\" (UniqueName: \"kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383371 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383403 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383421 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383438 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383454 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383481 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wc7d\" (UniqueName: \"kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383499 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383533 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.383554 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cx5qb\" (UniqueName: \"kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.384945 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.384970 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.391849 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.393677 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.419043 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsgj\" (UniqueName: \"kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj\") pod \"nova-metadata-0\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.482641 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.487896 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488061 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488172 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488276 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt59m\" (UniqueName: \"kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488395 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488527 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488646 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.488740 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490204 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490145 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490093 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490299 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490641 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wc7d\" (UniqueName: \"kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490710 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx5qb\" (UniqueName: \"kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.490811 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.492655 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.498356 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.505781 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.506323 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt59m\" (UniqueName: \"kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.508184 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.509077 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wc7d\" (UniqueName: \"kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d\") pod \"dnsmasq-dns-64678d8c55-rnd9s\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.510104 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.510978 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx5qb\" (UniqueName: \"kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb\") pod \"nova-scheduler-0\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.707548 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.719577 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.728702 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.758071 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2vjkx"] Oct 03 11:22:28 crc kubenswrapper[4990]: W1003 11:22:28.782018 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2766af4_5f69_4efa_9f3d_0899a146114a.slice/crio-715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749 WatchSource:0}: Error finding container 715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749: Status 404 returned error can't find the container with id 715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749 Oct 03 11:22:28 crc kubenswrapper[4990]: I1003 11:22:28.813692 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:28 crc kubenswrapper[4990]: W1003 11:22:28.832685 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode763f685_b474_4669_a91b_ad58f30fbe57.slice/crio-05df0f36124b2d1df217d3612a984aaef8daf89e26c1f671bdeb3cc0aab2e50c WatchSource:0}: Error finding container 05df0f36124b2d1df217d3612a984aaef8daf89e26c1f671bdeb3cc0aab2e50c: Status 404 returned error can't find the container with id 05df0f36124b2d1df217d3612a984aaef8daf89e26c1f671bdeb3cc0aab2e50c Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.045266 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerStarted","Data":"05df0f36124b2d1df217d3612a984aaef8daf89e26c1f671bdeb3cc0aab2e50c"} Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.051371 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2vjkx" 
event={"ID":"d2766af4-5f69-4efa-9f3d-0899a146114a","Type":"ContainerStarted","Data":"715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749"} Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.121973 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qpvxf"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.123256 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.131978 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.132231 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.134085 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qpvxf"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.145131 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.309232 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.309825 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmhkz\" (UniqueName: \"kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " 
pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.309913 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.309956 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.337242 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.422489 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.422548 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmhkz\" (UniqueName: \"kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.422631 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.422667 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.429154 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.434185 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.440073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.443851 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmhkz\" (UniqueName: 
\"kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz\") pod \"nova-cell1-conductor-db-sync-qpvxf\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.452320 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.457438 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.484433 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:22:29 crc kubenswrapper[4990]: I1003 11:22:29.996761 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qpvxf"] Oct 03 11:22:29 crc kubenswrapper[4990]: W1003 11:22:29.997745 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46f302eb_c586_4bf2_aa6b_31ae2585cace.slice/crio-cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc WatchSource:0}: Error finding container cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc: Status 404 returned error can't find the container with id cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.087497 4990 generic.go:334] "Generic (PLEG): container finished" podID="711df095-f8d6-419b-8586-e738ca12b49c" containerID="c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223" exitCode=0 Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.087823 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" 
event={"ID":"711df095-f8d6-419b-8586-e738ca12b49c","Type":"ContainerDied","Data":"c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.087861 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" event={"ID":"711df095-f8d6-419b-8586-e738ca12b49c","Type":"ContainerStarted","Data":"9e6f2c18980df4a9f015fe879ffecddb31b2bd81bb172318eb0be2278f021d68"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.100333 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9839bef-888b-426e-b1c7-7bed97a96c77","Type":"ContainerStarted","Data":"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.100564 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9839bef-888b-426e-b1c7-7bed97a96c77","Type":"ContainerStarted","Data":"9bfb2b0f5007f4ae17337dc134dc0691ad0a8e820a184a9976408a3ab22c2148"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.106801 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c29e3677-3e1b-48e4-91f5-13d1c959732d","Type":"ContainerStarted","Data":"799c1984d3983906894a732c56ac70f6875f796b71619e0e5c76c2911d2cce2b"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.106857 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c29e3677-3e1b-48e4-91f5-13d1c959732d","Type":"ContainerStarted","Data":"91f2c82899ddf2c88e6a06c4a469b29a41c64d896b566a5286b0238758c4d521"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.116855 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2vjkx" 
event={"ID":"d2766af4-5f69-4efa-9f3d-0899a146114a","Type":"ContainerStarted","Data":"b50ab588823a7afca54e2ade2be90f69add87860677827e6704600421d118bb9"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.120991 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" event={"ID":"46f302eb-c586-4bf2-aa6b-31ae2585cace","Type":"ContainerStarted","Data":"cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.152908 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerStarted","Data":"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.152956 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerStarted","Data":"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.152965 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerStarted","Data":"01b3869fb7af0abb94b05b81b2300ef1eadc2b22a9f13c312cb48f9061e4da6d"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.162031 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.162010774 podStartE2EDuration="2.162010774s" podCreationTimestamp="2025-10-03 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:30.16033752 +0000 UTC m=+5931.956969387" watchObservedRunningTime="2025-10-03 11:22:30.162010774 +0000 UTC m=+5931.958642631" Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.163245 
4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerStarted","Data":"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.163277 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerStarted","Data":"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"} Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.241436 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.241405601 podStartE2EDuration="2.241405601s" podCreationTimestamp="2025-10-03 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:30.17969084 +0000 UTC m=+5931.976322707" watchObservedRunningTime="2025-10-03 11:22:30.241405601 +0000 UTC m=+5932.038037458" Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.250045 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2vjkx" podStartSLOduration=3.250030014 podStartE2EDuration="3.250030014s" podCreationTimestamp="2025-10-03 11:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:30.203691039 +0000 UTC m=+5932.000322906" watchObservedRunningTime="2025-10-03 11:22:30.250030014 +0000 UTC m=+5932.046661861" Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.267873 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.267853504 podStartE2EDuration="2.267853504s" podCreationTimestamp="2025-10-03 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:30.234551875 +0000 UTC m=+5932.031183742" watchObservedRunningTime="2025-10-03 11:22:30.267853504 +0000 UTC m=+5932.064485361" Oct 03 11:22:30 crc kubenswrapper[4990]: I1003 11:22:30.279702 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.279497104 podStartE2EDuration="2.279497104s" podCreationTimestamp="2025-10-03 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:30.27546851 +0000 UTC m=+5932.072100387" watchObservedRunningTime="2025-10-03 11:22:30.279497104 +0000 UTC m=+5932.076128971" Oct 03 11:22:31 crc kubenswrapper[4990]: I1003 11:22:31.175593 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" event={"ID":"46f302eb-c586-4bf2-aa6b-31ae2585cace","Type":"ContainerStarted","Data":"259d004c8ffd58a92028951ef2bd6187492cfcae12954cd45c79af459b54cc13"} Oct 03 11:22:31 crc kubenswrapper[4990]: I1003 11:22:31.179726 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" event={"ID":"711df095-f8d6-419b-8586-e738ca12b49c","Type":"ContainerStarted","Data":"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d"} Oct 03 11:22:31 crc kubenswrapper[4990]: I1003 11:22:31.179779 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:22:31 crc kubenswrapper[4990]: I1003 11:22:31.201545 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" podStartSLOduration=2.201524907 podStartE2EDuration="2.201524907s" podCreationTimestamp="2025-10-03 11:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-03 11:22:31.19582041 +0000 UTC m=+5932.992452297" watchObservedRunningTime="2025-10-03 11:22:31.201524907 +0000 UTC m=+5932.998156764" Oct 03 11:22:31 crc kubenswrapper[4990]: I1003 11:22:31.224669 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" podStartSLOduration=3.224645513 podStartE2EDuration="3.224645513s" podCreationTimestamp="2025-10-03 11:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:31.212998353 +0000 UTC m=+5933.009630230" watchObservedRunningTime="2025-10-03 11:22:31.224645513 +0000 UTC m=+5933.021277370" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.228123 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.228658 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b9839bef-888b-426e-b1c7-7bed97a96c77" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093" gracePeriod=30 Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.237838 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.238085 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-log" containerID="cri-o://6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" gracePeriod=30 Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.238590 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" 
containerName="nova-metadata-metadata" containerID="cri-o://acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" gracePeriod=30 Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.807228 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.908744 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data\") pod \"241395b7-bdfc-4f2d-be24-daa4b77281db\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.908870 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs\") pod \"241395b7-bdfc-4f2d-be24-daa4b77281db\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.909037 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzsgj\" (UniqueName: \"kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj\") pod \"241395b7-bdfc-4f2d-be24-daa4b77281db\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.909125 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle\") pod \"241395b7-bdfc-4f2d-be24-daa4b77281db\" (UID: \"241395b7-bdfc-4f2d-be24-daa4b77281db\") " Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.909461 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs" (OuterVolumeSpecName: "logs") pod 
"241395b7-bdfc-4f2d-be24-daa4b77281db" (UID: "241395b7-bdfc-4f2d-be24-daa4b77281db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.909604 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/241395b7-bdfc-4f2d-be24-daa4b77281db-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.915863 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.916868 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj" (OuterVolumeSpecName: "kube-api-access-bzsgj") pod "241395b7-bdfc-4f2d-be24-daa4b77281db" (UID: "241395b7-bdfc-4f2d-be24-daa4b77281db"). InnerVolumeSpecName "kube-api-access-bzsgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.936042 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "241395b7-bdfc-4f2d-be24-daa4b77281db" (UID: "241395b7-bdfc-4f2d-be24-daa4b77281db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:32 crc kubenswrapper[4990]: I1003 11:22:32.938192 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data" (OuterVolumeSpecName: "config-data") pod "241395b7-bdfc-4f2d-be24-daa4b77281db" (UID: "241395b7-bdfc-4f2d-be24-daa4b77281db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.011457 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzsgj\" (UniqueName: \"kubernetes.io/projected/241395b7-bdfc-4f2d-be24-daa4b77281db-kube-api-access-bzsgj\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.011503 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.011528 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/241395b7-bdfc-4f2d-be24-daa4b77281db-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.112493 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data\") pod \"b9839bef-888b-426e-b1c7-7bed97a96c77\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.112686 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle\") pod \"b9839bef-888b-426e-b1c7-7bed97a96c77\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.112793 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt59m\" (UniqueName: \"kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m\") pod \"b9839bef-888b-426e-b1c7-7bed97a96c77\" (UID: \"b9839bef-888b-426e-b1c7-7bed97a96c77\") " Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 
11:22:33.116113 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m" (OuterVolumeSpecName: "kube-api-access-vt59m") pod "b9839bef-888b-426e-b1c7-7bed97a96c77" (UID: "b9839bef-888b-426e-b1c7-7bed97a96c77"). InnerVolumeSpecName "kube-api-access-vt59m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.138185 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data" (OuterVolumeSpecName: "config-data") pod "b9839bef-888b-426e-b1c7-7bed97a96c77" (UID: "b9839bef-888b-426e-b1c7-7bed97a96c77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.140444 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9839bef-888b-426e-b1c7-7bed97a96c77" (UID: "b9839bef-888b-426e-b1c7-7bed97a96c77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198815 4990 generic.go:334] "Generic (PLEG): container finished" podID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerID="acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" exitCode=0 Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198854 4990 generic.go:334] "Generic (PLEG): container finished" podID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerID="6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" exitCode=143 Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198860 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerDied","Data":"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8"} Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198900 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerDied","Data":"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7"} Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198910 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"241395b7-bdfc-4f2d-be24-daa4b77281db","Type":"ContainerDied","Data":"01b3869fb7af0abb94b05b81b2300ef1eadc2b22a9f13c312cb48f9061e4da6d"} Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198925 4990 scope.go:117] "RemoveContainer" containerID="acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.198922 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.201264 4990 generic.go:334] "Generic (PLEG): container finished" podID="b9839bef-888b-426e-b1c7-7bed97a96c77" containerID="f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093" exitCode=0 Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.201287 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9839bef-888b-426e-b1c7-7bed97a96c77","Type":"ContainerDied","Data":"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093"} Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.201304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9839bef-888b-426e-b1c7-7bed97a96c77","Type":"ContainerDied","Data":"9bfb2b0f5007f4ae17337dc134dc0691ad0a8e820a184a9976408a3ab22c2148"} Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.201344 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.215001 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt59m\" (UniqueName: \"kubernetes.io/projected/b9839bef-888b-426e-b1c7-7bed97a96c77-kube-api-access-vt59m\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.215033 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.215042 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9839bef-888b-426e-b1c7-7bed97a96c77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.246221 4990 scope.go:117] "RemoveContainer" containerID="6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.268470 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.286953 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.323566 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 11:22:33.323937 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-metadata" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.323953 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-metadata" Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 
11:22:33.323963 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9839bef-888b-426e-b1c7-7bed97a96c77" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.323969 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9839bef-888b-426e-b1c7-7bed97a96c77" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 11:22:33.323982 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-log" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.323992 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-log" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.324231 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-log" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.324263 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" containerName="nova-metadata-metadata" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.324291 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9839bef-888b-426e-b1c7-7bed97a96c77" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.325080 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.335148 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.335342 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.335150 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.352223 4990 scope.go:117] "RemoveContainer" containerID="acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 11:22:33.353973 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8\": container with ID starting with acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8 not found: ID does not exist" containerID="acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.354029 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8"} err="failed to get container status \"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8\": rpc error: code = NotFound desc = could not find container \"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8\": container with ID starting with acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8 not found: ID does not exist" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.354068 4990 scope.go:117] "RemoveContainer" 
containerID="6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 11:22:33.355907 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7\": container with ID starting with 6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7 not found: ID does not exist" containerID="6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.356711 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7"} err="failed to get container status \"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7\": rpc error: code = NotFound desc = could not find container \"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7\": container with ID starting with 6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7 not found: ID does not exist" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.356745 4990 scope.go:117] "RemoveContainer" containerID="acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.357159 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8"} err="failed to get container status \"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8\": rpc error: code = NotFound desc = could not find container \"acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8\": container with ID starting with acd95a0f4a199f91043637d90eedb61ae2cf1e4028fa49fd38e354dec3d0aef8 not found: ID does not exist" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.360561 4990 scope.go:117] 
"RemoveContainer" containerID="6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.361172 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7"} err="failed to get container status \"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7\": rpc error: code = NotFound desc = could not find container \"6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7\": container with ID starting with 6c12c93765100ada8f379ccb0aa6db7b2f81692666535746189001a40b566ba7 not found: ID does not exist" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.361288 4990 scope.go:117] "RemoveContainer" containerID="f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.375317 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.400372 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.411366 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.420421 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.423631 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.428389 4990 scope.go:117] "RemoveContainer" containerID="f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.428497 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.428740 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 11:22:33 crc kubenswrapper[4990]: E1003 11:22:33.429012 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093\": container with ID starting with f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093 not found: ID does not exist" containerID="f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.429052 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093"} err="failed to get container status \"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093\": rpc error: code = NotFound desc = could not find container \"f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093\": container with ID starting with f25a8a09e19df427b7055dc500dc70b9cd2158a1ec7b1488091802bf63be7093 not found: ID does not exist" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.431435 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.526820 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.526869 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.526896 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxs5\" (UniqueName: \"kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527215 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527241 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527312 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527382 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527425 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527455 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfkj\" (UniqueName: \"kubernetes.io/projected/6ee22400-6c67-47ea-8660-221b585bf898-kube-api-access-krfkj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.527477 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.628698 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629016 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629050 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfkj\" (UniqueName: \"kubernetes.io/projected/6ee22400-6c67-47ea-8660-221b585bf898-kube-api-access-krfkj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629121 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629142 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629165 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxs5\" (UniqueName: \"kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629186 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629210 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629248 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.629662 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.632703 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.632725 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.635602 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.636178 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.637038 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.638293 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-config-data\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.638771 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ee22400-6c67-47ea-8660-221b585bf898-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.649220 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxs5\" (UniqueName: \"kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5\") pod \"nova-metadata-0\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.655972 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfkj\" (UniqueName: \"kubernetes.io/projected/6ee22400-6c67-47ea-8660-221b585bf898-kube-api-access-krfkj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6ee22400-6c67-47ea-8660-221b585bf898\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.729849 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.749629 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:33 crc kubenswrapper[4990]: I1003 11:22:33.948787 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.215369 4990 generic.go:334] "Generic (PLEG): container finished" podID="46f302eb-c586-4bf2-aa6b-31ae2585cace" containerID="259d004c8ffd58a92028951ef2bd6187492cfcae12954cd45c79af459b54cc13" exitCode=0 Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.215442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" event={"ID":"46f302eb-c586-4bf2-aa6b-31ae2585cace","Type":"ContainerDied","Data":"259d004c8ffd58a92028951ef2bd6187492cfcae12954cd45c79af459b54cc13"} Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.239296 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:34 crc kubenswrapper[4990]: W1003 11:22:34.241253 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod728ddc87_15e1_43d4_9856_51da796df9c8.slice/crio-c88d95402974a3982596eaba8da1b01a124c870c75eceb0a86c9d25fe2b5c73b WatchSource:0}: Error finding container c88d95402974a3982596eaba8da1b01a124c870c75eceb0a86c9d25fe2b5c73b: Status 404 returned error can't find the container with id c88d95402974a3982596eaba8da1b01a124c870c75eceb0a86c9d25fe2b5c73b Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.365896 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 11:22:34 crc kubenswrapper[4990]: W1003 11:22:34.375048 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ee22400_6c67_47ea_8660_221b585bf898.slice/crio-9d27841f577d93d2d3268ef93b0444de65cf45ec1d08873d0df508b0f7750601 WatchSource:0}: Error finding container 9d27841f577d93d2d3268ef93b0444de65cf45ec1d08873d0df508b0f7750601: Status 404 returned error can't find the container with id 
9d27841f577d93d2d3268ef93b0444de65cf45ec1d08873d0df508b0f7750601 Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.883604 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241395b7-bdfc-4f2d-be24-daa4b77281db" path="/var/lib/kubelet/pods/241395b7-bdfc-4f2d-be24-daa4b77281db/volumes" Oct 03 11:22:34 crc kubenswrapper[4990]: I1003 11:22:34.884593 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9839bef-888b-426e-b1c7-7bed97a96c77" path="/var/lib/kubelet/pods/b9839bef-888b-426e-b1c7-7bed97a96c77/volumes" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.229442 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerStarted","Data":"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.229498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerStarted","Data":"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.229533 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerStarted","Data":"c88d95402974a3982596eaba8da1b01a124c870c75eceb0a86c9d25fe2b5c73b"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.232277 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6ee22400-6c67-47ea-8660-221b585bf898","Type":"ContainerStarted","Data":"20c1b9487221846fb10d2ab0959003f4d24b10e28ea430b29fa972541013ac14"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.232347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6ee22400-6c67-47ea-8660-221b585bf898","Type":"ContainerStarted","Data":"9d27841f577d93d2d3268ef93b0444de65cf45ec1d08873d0df508b0f7750601"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.245088 4990 generic.go:334] "Generic (PLEG): container finished" podID="d2766af4-5f69-4efa-9f3d-0899a146114a" containerID="b50ab588823a7afca54e2ade2be90f69add87860677827e6704600421d118bb9" exitCode=0 Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.245234 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2vjkx" event={"ID":"d2766af4-5f69-4efa-9f3d-0899a146114a","Type":"ContainerDied","Data":"b50ab588823a7afca54e2ade2be90f69add87860677827e6704600421d118bb9"} Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.300322 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.300304051 podStartE2EDuration="2.300304051s" podCreationTimestamp="2025-10-03 11:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:35.271219611 +0000 UTC m=+5937.067851498" watchObservedRunningTime="2025-10-03 11:22:35.300304051 +0000 UTC m=+5937.096935908" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.312962 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.312943147 podStartE2EDuration="2.312943147s" podCreationTimestamp="2025-10-03 11:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:35.310233027 +0000 UTC m=+5937.106864894" watchObservedRunningTime="2025-10-03 11:22:35.312943147 +0000 UTC m=+5937.109575004" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.587050 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.781452 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle\") pod \"46f302eb-c586-4bf2-aa6b-31ae2585cace\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.781569 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data\") pod \"46f302eb-c586-4bf2-aa6b-31ae2585cace\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.781611 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmhkz\" (UniqueName: \"kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz\") pod \"46f302eb-c586-4bf2-aa6b-31ae2585cace\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.781729 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts\") pod \"46f302eb-c586-4bf2-aa6b-31ae2585cace\" (UID: \"46f302eb-c586-4bf2-aa6b-31ae2585cace\") " Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.789759 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz" (OuterVolumeSpecName: "kube-api-access-wmhkz") pod "46f302eb-c586-4bf2-aa6b-31ae2585cace" (UID: "46f302eb-c586-4bf2-aa6b-31ae2585cace"). InnerVolumeSpecName "kube-api-access-wmhkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.793728 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts" (OuterVolumeSpecName: "scripts") pod "46f302eb-c586-4bf2-aa6b-31ae2585cace" (UID: "46f302eb-c586-4bf2-aa6b-31ae2585cace"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.832943 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data" (OuterVolumeSpecName: "config-data") pod "46f302eb-c586-4bf2-aa6b-31ae2585cace" (UID: "46f302eb-c586-4bf2-aa6b-31ae2585cace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.833798 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46f302eb-c586-4bf2-aa6b-31ae2585cace" (UID: "46f302eb-c586-4bf2-aa6b-31ae2585cace"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.883804 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.883832 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.883841 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmhkz\" (UniqueName: \"kubernetes.io/projected/46f302eb-c586-4bf2-aa6b-31ae2585cace-kube-api-access-wmhkz\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:35 crc kubenswrapper[4990]: I1003 11:22:35.883849 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f302eb-c586-4bf2-aa6b-31ae2585cace-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.257330 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" event={"ID":"46f302eb-c586-4bf2-aa6b-31ae2585cace","Type":"ContainerDied","Data":"cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc"} Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.257382 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd0a45a86d4c02eeb69c4bd320fb311d010a3e3b0958fa0fd8d94cef3e5733dc" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.257382 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qpvxf" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.368646 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 11:22:36 crc kubenswrapper[4990]: E1003 11:22:36.372578 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f302eb-c586-4bf2-aa6b-31ae2585cace" containerName="nova-cell1-conductor-db-sync" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.372609 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f302eb-c586-4bf2-aa6b-31ae2585cace" containerName="nova-cell1-conductor-db-sync" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.373128 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f302eb-c586-4bf2-aa6b-31ae2585cace" containerName="nova-cell1-conductor-db-sync" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.374809 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.381704 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.389058 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.497079 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.497182 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thhnf\" (UniqueName: 
\"kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.497232 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.599920 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thhnf\" (UniqueName: \"kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.600025 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.600283 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.605230 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.607947 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.616457 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thhnf\" (UniqueName: \"kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf\") pod \"nova-cell1-conductor-0\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.682961 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.713071 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.803786 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle\") pod \"d2766af4-5f69-4efa-9f3d-0899a146114a\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.803874 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data\") pod \"d2766af4-5f69-4efa-9f3d-0899a146114a\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.803931 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5x4g\" (UniqueName: \"kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g\") pod \"d2766af4-5f69-4efa-9f3d-0899a146114a\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.803953 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts\") pod \"d2766af4-5f69-4efa-9f3d-0899a146114a\" (UID: \"d2766af4-5f69-4efa-9f3d-0899a146114a\") " Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.808966 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts" (OuterVolumeSpecName: "scripts") pod "d2766af4-5f69-4efa-9f3d-0899a146114a" (UID: "d2766af4-5f69-4efa-9f3d-0899a146114a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.809805 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g" (OuterVolumeSpecName: "kube-api-access-w5x4g") pod "d2766af4-5f69-4efa-9f3d-0899a146114a" (UID: "d2766af4-5f69-4efa-9f3d-0899a146114a"). InnerVolumeSpecName "kube-api-access-w5x4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.832982 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2766af4-5f69-4efa-9f3d-0899a146114a" (UID: "d2766af4-5f69-4efa-9f3d-0899a146114a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.861628 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data" (OuterVolumeSpecName: "config-data") pod "d2766af4-5f69-4efa-9f3d-0899a146114a" (UID: "d2766af4-5f69-4efa-9f3d-0899a146114a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.908497 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.908541 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5x4g\" (UniqueName: \"kubernetes.io/projected/d2766af4-5f69-4efa-9f3d-0899a146114a-kube-api-access-w5x4g\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.908552 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:36 crc kubenswrapper[4990]: I1003 11:22:36.908561 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2766af4-5f69-4efa-9f3d-0899a146114a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.214194 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 11:22:37 crc kubenswrapper[4990]: W1003 11:22:37.223781 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod047f6c42_6806_4dde_b829_7748a2560e2a.slice/crio-b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb WatchSource:0}: Error finding container b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb: Status 404 returned error can't find the container with id b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.268981 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2vjkx" 
event={"ID":"d2766af4-5f69-4efa-9f3d-0899a146114a","Type":"ContainerDied","Data":"715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749"} Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.269027 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715c90e873b1ac11911fbb600c826557ece22f39df1775b325592baafee09749" Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.269041 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2vjkx" Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.270189 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"047f6c42-6806-4dde-b829-7748a2560e2a","Type":"ContainerStarted","Data":"b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb"} Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.492604 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.493579 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-log" containerID="cri-o://fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552" gracePeriod=30 Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.493764 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-api" containerID="cri-o://4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e" gracePeriod=30 Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.514488 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.519622 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="c29e3677-3e1b-48e4-91f5-13d1c959732d" containerName="nova-scheduler-scheduler" containerID="cri-o://799c1984d3983906894a732c56ac70f6875f796b71619e0e5c76c2911d2cce2b" gracePeriod=30 Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.533498 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.533849 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-log" containerID="cri-o://12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2" gracePeriod=30 Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.533954 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-metadata" containerID="cri-o://1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9" gracePeriod=30 Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.874219 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:22:37 crc kubenswrapper[4990]: E1003 11:22:37.874915 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:22:37 crc kubenswrapper[4990]: I1003 11:22:37.975392 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.029217 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data\") pod \"e763f685-b474-4669-a91b-ad58f30fbe57\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.029312 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmhgt\" (UniqueName: \"kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt\") pod \"e763f685-b474-4669-a91b-ad58f30fbe57\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.029381 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs\") pod \"e763f685-b474-4669-a91b-ad58f30fbe57\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.029473 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle\") pod \"e763f685-b474-4669-a91b-ad58f30fbe57\" (UID: \"e763f685-b474-4669-a91b-ad58f30fbe57\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.030051 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs" (OuterVolumeSpecName: "logs") pod "e763f685-b474-4669-a91b-ad58f30fbe57" (UID: "e763f685-b474-4669-a91b-ad58f30fbe57"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.037272 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt" (OuterVolumeSpecName: "kube-api-access-bmhgt") pod "e763f685-b474-4669-a91b-ad58f30fbe57" (UID: "e763f685-b474-4669-a91b-ad58f30fbe57"). InnerVolumeSpecName "kube-api-access-bmhgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.063647 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e763f685-b474-4669-a91b-ad58f30fbe57" (UID: "e763f685-b474-4669-a91b-ad58f30fbe57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.064286 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data" (OuterVolumeSpecName: "config-data") pod "e763f685-b474-4669-a91b-ad58f30fbe57" (UID: "e763f685-b474-4669-a91b-ad58f30fbe57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.130936 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.130972 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmhgt\" (UniqueName: \"kubernetes.io/projected/e763f685-b474-4669-a91b-ad58f30fbe57-kube-api-access-bmhgt\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.130982 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e763f685-b474-4669-a91b-ad58f30fbe57-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.130991 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e763f685-b474-4669-a91b-ad58f30fbe57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.139853 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.281708 4990 generic.go:334] "Generic (PLEG): container finished" podID="728ddc87-15e1-43d4-9856-51da796df9c8" containerID="1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9" exitCode=0 Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.281966 4990 generic.go:334] "Generic (PLEG): container finished" podID="728ddc87-15e1-43d4-9856-51da796df9c8" containerID="12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2" exitCode=143 Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.281777 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerDied","Data":"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.281798 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.282038 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerDied","Data":"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.282063 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"728ddc87-15e1-43d4-9856-51da796df9c8","Type":"ContainerDied","Data":"c88d95402974a3982596eaba8da1b01a124c870c75eceb0a86c9d25fe2b5c73b"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.282082 4990 scope.go:117] "RemoveContainer" containerID="1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.285306 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"047f6c42-6806-4dde-b829-7748a2560e2a","Type":"ContainerStarted","Data":"80ff68c4a3e79b5d970f66d8b340c581812ccd94c0e23b62621d70e3ce5516d4"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.285545 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287669 4990 generic.go:334] "Generic (PLEG): container finished" podID="e763f685-b474-4669-a91b-ad58f30fbe57" containerID="4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e" exitCode=0 Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287691 4990 generic.go:334] "Generic (PLEG): container finished" podID="e763f685-b474-4669-a91b-ad58f30fbe57" containerID="fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552" exitCode=143 Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287720 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerDied","Data":"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287742 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerDied","Data":"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287757 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e763f685-b474-4669-a91b-ad58f30fbe57","Type":"ContainerDied","Data":"05df0f36124b2d1df217d3612a984aaef8daf89e26c1f671bdeb3cc0aab2e50c"} Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.287805 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.310732 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.310712881 podStartE2EDuration="2.310712881s" podCreationTimestamp="2025-10-03 11:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:38.303740101 +0000 UTC m=+5940.100371958" watchObservedRunningTime="2025-10-03 11:22:38.310712881 +0000 UTC m=+5940.107344748" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.311694 4990 scope.go:117] "RemoveContainer" containerID="12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.330576 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.335190 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs\") pod \"728ddc87-15e1-43d4-9856-51da796df9c8\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.335275 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs\") pod \"728ddc87-15e1-43d4-9856-51da796df9c8\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.335399 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bxs5\" (UniqueName: \"kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5\") pod \"728ddc87-15e1-43d4-9856-51da796df9c8\" (UID: 
\"728ddc87-15e1-43d4-9856-51da796df9c8\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.335445 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data\") pod \"728ddc87-15e1-43d4-9856-51da796df9c8\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.335501 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle\") pod \"728ddc87-15e1-43d4-9856-51da796df9c8\" (UID: \"728ddc87-15e1-43d4-9856-51da796df9c8\") " Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.347411 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs" (OuterVolumeSpecName: "logs") pod "728ddc87-15e1-43d4-9856-51da796df9c8" (UID: "728ddc87-15e1-43d4-9856-51da796df9c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.347567 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.349668 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5" (OuterVolumeSpecName: "kube-api-access-4bxs5") pod "728ddc87-15e1-43d4-9856-51da796df9c8" (UID: "728ddc87-15e1-43d4-9856-51da796df9c8"). InnerVolumeSpecName "kube-api-access-4bxs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.353443 4990 scope.go:117] "RemoveContainer" containerID="1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9" Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.354384 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9\": container with ID starting with 1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9 not found: ID does not exist" containerID="1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.354424 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9"} err="failed to get container status \"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9\": rpc error: code = NotFound desc = could not find container \"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9\": container with ID starting with 1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9 not found: ID does not exist" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.354451 4990 scope.go:117] "RemoveContainer" containerID="12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2" Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.355353 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2\": container with ID starting with 12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2 not found: ID does not exist" containerID="12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2" Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.355430 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2"} err="failed to get container status \"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2\": rpc error: code = NotFound desc = could not find container \"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2\": container with ID starting with 12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2 not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.355464 4990 scope.go:117] "RemoveContainer" containerID="1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.356720 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9"} err="failed to get container status \"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9\": rpc error: code = NotFound desc = could not find container \"1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9\": container with ID starting with 1f0ed208e4e401e402f24e1540e4977148b634ab99ed8ae8261bc7ba9460f3e9 not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.356747 4990 scope.go:117] "RemoveContainer" containerID="12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.357482 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2"} err="failed to get container status \"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2\": rpc error: code = NotFound desc = could not find container \"12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2\": container with ID starting with 12a9886e995e99b154f1544f2ac6b77bd07323acfbf018466acfab17a7a4bda2 not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.357533 4990 scope.go:117] "RemoveContainer" containerID="4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.383395 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "728ddc87-15e1-43d4-9856-51da796df9c8" (UID: "728ddc87-15e1-43d4-9856-51da796df9c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.384566 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.385497 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-api"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.385544 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-api"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.385587 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2766af4-5f69-4efa-9f3d-0899a146114a" containerName="nova-manage"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.385596 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2766af4-5f69-4efa-9f3d-0899a146114a" containerName="nova-manage"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.385612 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-metadata"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.385625 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-metadata"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.385643 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.385652 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.385669 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.385676 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.386138 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.386182 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-metadata"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.386193 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" containerName="nova-api-api"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.386218 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2766af4-5f69-4efa-9f3d-0899a146114a" containerName="nova-manage"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.386246 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" containerName="nova-metadata-log"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.388402 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.391002 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.393478 4990 scope.go:117] "RemoveContainer" containerID="fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.397580 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data" (OuterVolumeSpecName: "config-data") pod "728ddc87-15e1-43d4-9856-51da796df9c8" (UID: "728ddc87-15e1-43d4-9856-51da796df9c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.406348 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.418206 4990 scope.go:117] "RemoveContainer" containerID="4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.418631 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e\": container with ID starting with 4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e not found: ID does not exist" containerID="4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.418661 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"} err="failed to get container status \"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e\": rpc error: code = NotFound desc = could not find container \"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e\": container with ID starting with 4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.418686 4990 scope.go:117] "RemoveContainer" containerID="fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"
Oct 03 11:22:38 crc kubenswrapper[4990]: E1003 11:22:38.418996 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552\": container with ID starting with fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552 not found: ID does not exist" containerID="fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.419025 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"} err="failed to get container status \"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552\": rpc error: code = NotFound desc = could not find container \"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552\": container with ID starting with fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552 not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.419045 4990 scope.go:117] "RemoveContainer" containerID="4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.419247 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e"} err="failed to get container status \"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e\": rpc error: code = NotFound desc = could not find container \"4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e\": container with ID starting with 4d4eba3251b38f3828944845a8866c352205c6044a01d32927b5dd105bffed3e not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.419272 4990 scope.go:117] "RemoveContainer" containerID="fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.419699 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552"} err="failed to get container status \"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552\": rpc error: code = NotFound desc = could not find container \"fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552\": container with ID starting with fbaab924171119f0a12597a6dbb8c73648d7371ae28f3a43e02ffe8243b1b552 not found: ID does not exist"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.441866 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "728ddc87-15e1-43d4-9856-51da796df9c8" (UID: "728ddc87-15e1-43d4-9856-51da796df9c8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447326 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447461 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447566 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447612 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpgvq\" (UniqueName: \"kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447735 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bxs5\" (UniqueName: \"kubernetes.io/projected/728ddc87-15e1-43d4-9856-51da796df9c8-kube-api-access-4bxs5\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447758 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447771 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447781 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/728ddc87-15e1-43d4-9856-51da796df9c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.447789 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/728ddc87-15e1-43d4-9856-51da796df9c8-logs\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.549738 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.549841 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.549891 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.549925 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpgvq\" (UniqueName: \"kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.550132 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.554444 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.554615 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.571766 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpgvq\" (UniqueName: \"kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq\") pod \"nova-api-0\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.695010 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.708250 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.713633 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.721495 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.741934 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.743863 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.745868 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.746147 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.752611 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.754040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.754078 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.754146 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p576m\" (UniqueName: \"kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.754217 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.754245 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.820559 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"]
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.821131 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="dnsmasq-dns" containerID="cri-o://66fa367ce4241e874fc3d82bb892eb52cb9b9d7ee0de59fa2b264d09a3b1f742" gracePeriod=10
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.859602 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.859652 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.859728 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.859743 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.859809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p576m\" (UniqueName: \"kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.861464 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.865987 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.866274 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.866664 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.888713 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p576m\" (UniqueName: \"kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m\") pod \"nova-metadata-0\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " pod="openstack/nova-metadata-0"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.892743 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="728ddc87-15e1-43d4-9856-51da796df9c8" path="/var/lib/kubelet/pods/728ddc87-15e1-43d4-9856-51da796df9c8/volumes"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.893712 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e763f685-b474-4669-a91b-ad58f30fbe57" path="/var/lib/kubelet/pods/e763f685-b474-4669-a91b-ad58f30fbe57/volumes"
Oct 03 11:22:38 crc kubenswrapper[4990]: I1003 11:22:38.950377 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.155034 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.271617 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.303680 4990 generic.go:334] "Generic (PLEG): container finished" podID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerID="66fa367ce4241e874fc3d82bb892eb52cb9b9d7ee0de59fa2b264d09a3b1f742" exitCode=0
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.303752 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerDied","Data":"66fa367ce4241e874fc3d82bb892eb52cb9b9d7ee0de59fa2b264d09a3b1f742"}
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.385572 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b665ddf7-pgx56"
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.481526 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb\") pod \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") "
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.481708 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb\") pod \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") "
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.481877 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc\") pod \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") "
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.481988 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fkv\" (UniqueName: \"kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv\") pod \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") "
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.482130 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config\") pod \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\" (UID: \"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4\") "
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.489835 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv" (OuterVolumeSpecName: "kube-api-access-79fkv") pod "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" (UID: "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4"). InnerVolumeSpecName "kube-api-access-79fkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.539177 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" (UID: "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.556095 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config" (OuterVolumeSpecName: "config") pod "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" (UID: "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.577313 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" (UID: "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.584600 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.584626 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fkv\" (UniqueName: \"kubernetes.io/projected/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-kube-api-access-79fkv\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.584638 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-config\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.584646 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.591303 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" (UID: "ecf7bdc4-ef3a-4378-9612-ecfcac3605d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.686674 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 11:22:39 crc kubenswrapper[4990]: I1003 11:22:39.731991 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.322586 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerStarted","Data":"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.322909 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerStarted","Data":"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.322921 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerStarted","Data":"a346199ddefe028dace5b52b422fef4c7a90c822c54a26da98fbc4c77561a036"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.326590 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerStarted","Data":"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.326626 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerStarted","Data":"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.326641 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerStarted","Data":"5a0361f91ea5c5c0e41e6825198206a004e7de586afdcce5366f0783a572fa41"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.329384 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b665ddf7-pgx56" event={"ID":"ecf7bdc4-ef3a-4378-9612-ecfcac3605d4","Type":"ContainerDied","Data":"194dd50dc27e9bee9b38d644372246d1e37fbed958fa48cebd779e2e952de013"}
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.329423 4990 scope.go:117] "RemoveContainer" containerID="66fa367ce4241e874fc3d82bb892eb52cb9b9d7ee0de59fa2b264d09a3b1f742"
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.329531 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b665ddf7-pgx56"
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.347942 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.347924729 podStartE2EDuration="2.347924729s" podCreationTimestamp="2025-10-03 11:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:40.337654594 +0000 UTC m=+5942.134286451" watchObservedRunningTime="2025-10-03 11:22:40.347924729 +0000 UTC m=+5942.144556586"
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.369159 4990 scope.go:117] "RemoveContainer" containerID="58ab21ca6e002bbc8bc0418152fe826d5f6645e778d95b00151af77c37d63aa3"
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.379736 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.379713719 podStartE2EDuration="2.379713719s" podCreationTimestamp="2025-10-03 11:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:40.35842143 +0000 UTC m=+5942.155053297" watchObservedRunningTime="2025-10-03 11:22:40.379713719 +0000 UTC m=+5942.176345586"
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.412027 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"]
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.421675 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b665ddf7-pgx56"]
Oct 03 11:22:40 crc kubenswrapper[4990]: I1003 11:22:40.886463 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" path="/var/lib/kubelet/pods/ecf7bdc4-ef3a-4378-9612-ecfcac3605d4/volumes"
Oct 03 11:22:43 crc kubenswrapper[4990]: I1003 11:22:43.949911 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 11:22:43 crc kubenswrapper[4990]: I1003 11:22:43.977274 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 11:22:44 crc kubenswrapper[4990]: I1003 11:22:44.157094 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 11:22:44 crc kubenswrapper[4990]: I1003 11:22:44.157161 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 03 11:22:44 crc kubenswrapper[4990]: I1003 11:22:44.398924 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 11:22:46 crc kubenswrapper[4990]: I1003 11:22:46.769013 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.330940 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5c5lk"]
Oct 03 11:22:47 crc kubenswrapper[4990]: E1003 11:22:47.331432 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="dnsmasq-dns"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.331446 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="dnsmasq-dns"
Oct 03 11:22:47 crc kubenswrapper[4990]: E1003 11:22:47.331483 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="init"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.331493 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="init"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.331772 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf7bdc4-ef3a-4378-9612-ecfcac3605d4" containerName="dnsmasq-dns"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.332639 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5c5lk"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.336830 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.337348 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.349134 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5c5lk"]
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.401911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gzjp\" (UniqueName: \"kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.402048 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk"
Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.402153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") "
pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.402333 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.504312 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.504378 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.504430 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.504476 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gzjp\" (UniqueName: \"kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 
03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.513252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.515083 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.521001 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.526038 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gzjp\" (UniqueName: \"kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp\") pod \"nova-cell1-cell-mapping-5c5lk\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.662438 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:47 crc kubenswrapper[4990]: I1003 11:22:47.924627 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5c5lk"] Oct 03 11:22:48 crc kubenswrapper[4990]: I1003 11:22:48.415682 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5c5lk" event={"ID":"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1","Type":"ContainerStarted","Data":"4bbfd6d2beded0d50b2894277f97d34264bca25424bd54c7a58dd5c32237b2be"} Oct 03 11:22:48 crc kubenswrapper[4990]: I1003 11:22:48.416035 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5c5lk" event={"ID":"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1","Type":"ContainerStarted","Data":"8e242294d78e102f4ba16181bad3b4db2752a35f8dc5856207e5f8defdce2609"} Oct 03 11:22:48 crc kubenswrapper[4990]: I1003 11:22:48.441891 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5c5lk" podStartSLOduration=1.441872952 podStartE2EDuration="1.441872952s" podCreationTimestamp="2025-10-03 11:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:22:48.435666102 +0000 UTC m=+5950.232297969" watchObservedRunningTime="2025-10-03 11:22:48.441872952 +0000 UTC m=+5950.238504819" Oct 03 11:22:48 crc kubenswrapper[4990]: I1003 11:22:48.714444 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:22:48 crc kubenswrapper[4990]: I1003 11:22:48.714494 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 11:22:49.157030 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 
11:22:49.157378 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 11:22:49.162484 4990 scope.go:117] "RemoveContainer" containerID="00975977b70c14e762c13a40ea1be06edca709faa936d6968528eb1b397359ee" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 11:22:49.196231 4990 scope.go:117] "RemoveContainer" containerID="60925d16550e1f42fd4f9244844d930d27a5bf3a2967905822bb9a501f4b819a" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 11:22:49.797922 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:22:49 crc kubenswrapper[4990]: I1003 11:22:49.798419 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:22:50 crc kubenswrapper[4990]: I1003 11:22:50.167328 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:22:50 crc kubenswrapper[4990]: I1003 11:22:50.167661 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.91:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 11:22:51 crc kubenswrapper[4990]: I1003 
11:22:51.872631 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:22:51 crc kubenswrapper[4990]: E1003 11:22:51.873828 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:22:53 crc kubenswrapper[4990]: I1003 11:22:53.463381 4990 generic.go:334] "Generic (PLEG): container finished" podID="92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" containerID="4bbfd6d2beded0d50b2894277f97d34264bca25424bd54c7a58dd5c32237b2be" exitCode=0 Oct 03 11:22:53 crc kubenswrapper[4990]: I1003 11:22:53.463450 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5c5lk" event={"ID":"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1","Type":"ContainerDied","Data":"4bbfd6d2beded0d50b2894277f97d34264bca25424bd54c7a58dd5c32237b2be"} Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.863140 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.877873 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle\") pod \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.878021 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts\") pod \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.878105 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gzjp\" (UniqueName: \"kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp\") pod \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.878353 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data\") pod \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\" (UID: \"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1\") " Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.883392 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts" (OuterVolumeSpecName: "scripts") pod "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" (UID: "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.884183 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp" (OuterVolumeSpecName: "kube-api-access-4gzjp") pod "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" (UID: "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1"). InnerVolumeSpecName "kube-api-access-4gzjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.915453 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" (UID: "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.936752 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data" (OuterVolumeSpecName: "config-data") pod "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" (UID: "92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.981208 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.981242 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.981257 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gzjp\" (UniqueName: \"kubernetes.io/projected/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-kube-api-access-4gzjp\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:54 crc kubenswrapper[4990]: I1003 11:22:54.981269 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.488968 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5c5lk" event={"ID":"92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1","Type":"ContainerDied","Data":"8e242294d78e102f4ba16181bad3b4db2752a35f8dc5856207e5f8defdce2609"} Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.489017 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e242294d78e102f4ba16181bad3b4db2752a35f8dc5856207e5f8defdce2609" Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.489094 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5c5lk" Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.681232 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.682329 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-log" containerID="cri-o://2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0" gracePeriod=30 Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.682386 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-api" containerID="cri-o://c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147" gracePeriod=30 Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.729708 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.729968 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-log" containerID="cri-o://6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736" gracePeriod=30 Oct 03 11:22:55 crc kubenswrapper[4990]: I1003 11:22:55.730112 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-metadata" containerID="cri-o://1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931" gracePeriod=30 Oct 03 11:22:56 crc kubenswrapper[4990]: I1003 11:22:56.502134 4990 generic.go:334] "Generic (PLEG): container finished" podID="b154ab54-3f06-490f-93b8-658ec26ed0e8" 
containerID="2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0" exitCode=143 Oct 03 11:22:56 crc kubenswrapper[4990]: I1003 11:22:56.502231 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerDied","Data":"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0"} Oct 03 11:22:56 crc kubenswrapper[4990]: I1003 11:22:56.504630 4990 generic.go:334] "Generic (PLEG): container finished" podID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerID="6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736" exitCode=143 Oct 03 11:22:56 crc kubenswrapper[4990]: I1003 11:22:56.504659 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerDied","Data":"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736"} Oct 03 11:23:02 crc kubenswrapper[4990]: I1003 11:23:02.873995 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:23:02 crc kubenswrapper[4990]: E1003 11:23:02.875202 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:23:07 crc kubenswrapper[4990]: I1003 11:23:07.619441 4990 generic.go:334] "Generic (PLEG): container finished" podID="c29e3677-3e1b-48e4-91f5-13d1c959732d" containerID="799c1984d3983906894a732c56ac70f6875f796b71619e0e5c76c2911d2cce2b" exitCode=137 Oct 03 11:23:07 crc kubenswrapper[4990]: I1003 11:23:07.619553 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"c29e3677-3e1b-48e4-91f5-13d1c959732d","Type":"ContainerDied","Data":"799c1984d3983906894a732c56ac70f6875f796b71619e0e5c76c2911d2cce2b"} Oct 03 11:23:07 crc kubenswrapper[4990]: I1003 11:23:07.964648 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.068715 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data\") pod \"c29e3677-3e1b-48e4-91f5-13d1c959732d\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.069777 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle\") pod \"c29e3677-3e1b-48e4-91f5-13d1c959732d\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.069870 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx5qb\" (UniqueName: \"kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb\") pod \"c29e3677-3e1b-48e4-91f5-13d1c959732d\" (UID: \"c29e3677-3e1b-48e4-91f5-13d1c959732d\") " Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.074784 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb" (OuterVolumeSpecName: "kube-api-access-cx5qb") pod "c29e3677-3e1b-48e4-91f5-13d1c959732d" (UID: "c29e3677-3e1b-48e4-91f5-13d1c959732d"). InnerVolumeSpecName "kube-api-access-cx5qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.095365 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data" (OuterVolumeSpecName: "config-data") pod "c29e3677-3e1b-48e4-91f5-13d1c959732d" (UID: "c29e3677-3e1b-48e4-91f5-13d1c959732d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.095843 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c29e3677-3e1b-48e4-91f5-13d1c959732d" (UID: "c29e3677-3e1b-48e4-91f5-13d1c959732d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.172069 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.172116 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29e3677-3e1b-48e4-91f5-13d1c959732d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.172130 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx5qb\" (UniqueName: \"kubernetes.io/projected/c29e3677-3e1b-48e4-91f5-13d1c959732d-kube-api-access-cx5qb\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.632641 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c29e3677-3e1b-48e4-91f5-13d1c959732d","Type":"ContainerDied","Data":"91f2c82899ddf2c88e6a06c4a469b29a41c64d896b566a5286b0238758c4d521"} Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.632696 4990 scope.go:117] "RemoveContainer" containerID="799c1984d3983906894a732c56ac70f6875f796b71619e0e5c76c2911d2cce2b" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.632723 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.697246 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.714555 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.714612 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.726001 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.741158 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:23:08 crc kubenswrapper[4990]: E1003 11:23:08.741849 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" containerName="nova-manage" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.741873 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" containerName="nova-manage" Oct 03 11:23:08 crc kubenswrapper[4990]: E1003 11:23:08.741912 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29e3677-3e1b-48e4-91f5-13d1c959732d" containerName="nova-scheduler-scheduler" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.741922 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c29e3677-3e1b-48e4-91f5-13d1c959732d" containerName="nova-scheduler-scheduler" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.742179 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29e3677-3e1b-48e4-91f5-13d1c959732d" containerName="nova-scheduler-scheduler" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.742202 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" containerName="nova-manage" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.743074 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.746040 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.750905 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.883068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.883437 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.883472 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65c46\" (UniqueName: 
\"kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.883576 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29e3677-3e1b-48e4-91f5-13d1c959732d" path="/var/lib/kubelet/pods/c29e3677-3e1b-48e4-91f5-13d1c959732d/volumes" Oct 03 11:23:08 crc kubenswrapper[4990]: E1003 11:23:08.888480 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29e3677_3e1b_48e4_91f5_13d1c959732d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc29e3677_3e1b_48e4_91f5_13d1c959732d.slice/crio-91f2c82899ddf2c88e6a06c4a469b29a41c64d896b566a5286b0238758c4d521\": RecentStats: unable to find data in memory cache]" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.985959 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.986044 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.986067 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65c46\" (UniqueName: 
\"kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.990454 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:08 crc kubenswrapper[4990]: I1003 11:23:08.996774 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.016417 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65c46\" (UniqueName: \"kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46\") pod \"nova-scheduler-0\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " pod="openstack/nova-scheduler-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.063640 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.481372 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.504200 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.532302 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596237 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs\") pod \"b154ab54-3f06-490f-93b8-658ec26ed0e8\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs\") pod \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596325 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpgvq\" (UniqueName: \"kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq\") pod \"b154ab54-3f06-490f-93b8-658ec26ed0e8\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596437 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle\") pod \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596523 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs\") pod \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " Oct 03 11:23:09 crc 
kubenswrapper[4990]: I1003 11:23:09.596550 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle\") pod \"b154ab54-3f06-490f-93b8-658ec26ed0e8\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596594 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p576m\" (UniqueName: \"kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m\") pod \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596620 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data\") pod \"b154ab54-3f06-490f-93b8-658ec26ed0e8\" (UID: \"b154ab54-3f06-490f-93b8-658ec26ed0e8\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596647 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data\") pod \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\" (UID: \"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4\") " Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.596893 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs" (OuterVolumeSpecName: "logs") pod "b154ab54-3f06-490f-93b8-658ec26ed0e8" (UID: "b154ab54-3f06-490f-93b8-658ec26ed0e8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.597313 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b154ab54-3f06-490f-93b8-658ec26ed0e8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.597936 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs" (OuterVolumeSpecName: "logs") pod "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" (UID: "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.600577 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq" (OuterVolumeSpecName: "kube-api-access-kpgvq") pod "b154ab54-3f06-490f-93b8-658ec26ed0e8" (UID: "b154ab54-3f06-490f-93b8-658ec26ed0e8"). InnerVolumeSpecName "kube-api-access-kpgvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.600648 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m" (OuterVolumeSpecName: "kube-api-access-p576m") pod "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" (UID: "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4"). InnerVolumeSpecName "kube-api-access-p576m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.623960 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data" (OuterVolumeSpecName: "config-data") pod "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" (UID: "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.626382 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" (UID: "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.626582 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b154ab54-3f06-490f-93b8-658ec26ed0e8" (UID: "b154ab54-3f06-490f-93b8-658ec26ed0e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.626817 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data" (OuterVolumeSpecName: "config-data") pod "b154ab54-3f06-490f-93b8-658ec26ed0e8" (UID: "b154ab54-3f06-490f-93b8-658ec26ed0e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.641923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f631f859-cbf9-48c9-9555-147dcce70e07","Type":"ContainerStarted","Data":"11fccaecfb7ebd40bc1c94f8e312909e9c1051d0855e4ccfa625a5024c8280e5"} Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.646430 4990 generic.go:334] "Generic (PLEG): container finished" podID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerID="c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147" exitCode=0 Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.646555 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerDied","Data":"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147"} Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.646669 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b154ab54-3f06-490f-93b8-658ec26ed0e8","Type":"ContainerDied","Data":"a346199ddefe028dace5b52b422fef4c7a90c822c54a26da98fbc4c77561a036"} Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.646684 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.646702 4990 scope.go:117] "RemoveContainer" containerID="c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.648984 4990 generic.go:334] "Generic (PLEG): container finished" podID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerID="1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931" exitCode=0 Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.649017 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerDied","Data":"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931"} Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.649043 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd8c5a21-89d3-442f-8ec5-d780d2fd22c4","Type":"ContainerDied","Data":"5a0361f91ea5c5c0e41e6825198206a004e7de586afdcce5366f0783a572fa41"} Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.649107 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.664945 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" (UID: "cd8c5a21-89d3-442f-8ec5-d780d2fd22c4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.667432 4990 scope.go:117] "RemoveContainer" containerID="2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.690241 4990 scope.go:117] "RemoveContainer" containerID="c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.690657 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147\": container with ID starting with c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147 not found: ID does not exist" containerID="c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.690719 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147"} err="failed to get container status \"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147\": rpc error: code = NotFound desc = could not find container \"c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147\": container with ID starting with c0ed871f7f9065aab10aa5dc4cbae0e15412880d9629b54c8a4275d475f69147 not found: ID does not exist" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.690756 4990 scope.go:117] "RemoveContainer" containerID="2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.691085 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0\": container with ID starting with 
2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0 not found: ID does not exist" containerID="2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.691124 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0"} err="failed to get container status \"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0\": rpc error: code = NotFound desc = could not find container \"2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0\": container with ID starting with 2b810b5bb42df1da6e324b2b591032735ac66e7e8d293dea75c9b81317d8d1d0 not found: ID does not exist" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.691153 4990 scope.go:117] "RemoveContainer" containerID="1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.696156 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699734 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699766 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpgvq\" (UniqueName: \"kubernetes.io/projected/b154ab54-3f06-490f-93b8-658ec26ed0e8-kube-api-access-kpgvq\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699779 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699815 4990 reconciler_common.go:293] "Volume 
detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699825 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699833 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p576m\" (UniqueName: \"kubernetes.io/projected/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-kube-api-access-p576m\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699842 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b154ab54-3f06-490f-93b8-658ec26ed0e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.699850 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.705133 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.718847 4990 scope.go:117] "RemoveContainer" containerID="6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.720672 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.721191 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-metadata" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721211 
4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-metadata" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.721231 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-api" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721239 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-api" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.721263 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-log" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721271 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-log" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.721298 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-log" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721307 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-log" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721558 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-log" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721579 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-metadata" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721598 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" containerName="nova-api-api" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.721628 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" containerName="nova-metadata-log" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.722999 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.726655 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.728767 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.759550 4990 scope.go:117] "RemoveContainer" containerID="1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.760556 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931\": container with ID starting with 1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931 not found: ID does not exist" containerID="1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.760594 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931"} err="failed to get container status \"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931\": rpc error: code = NotFound desc = could not find container \"1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931\": container with ID starting with 1fc8db11001ad53db732841bc9942d8f566f0dc21d7bb008be991a882da8c931 not found: ID does not exist" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.760620 4990 scope.go:117] "RemoveContainer" 
containerID="6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736" Oct 03 11:23:09 crc kubenswrapper[4990]: E1003 11:23:09.761010 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736\": container with ID starting with 6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736 not found: ID does not exist" containerID="6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.761096 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736"} err="failed to get container status \"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736\": rpc error: code = NotFound desc = could not find container \"6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736\": container with ID starting with 6e042ff0c7572b48dd75ed2e8d841bac09b3ac0194bc3d304ac61bf567bf9736 not found: ID does not exist" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.903217 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.903583 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.903851 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hch9z\" (UniqueName: \"kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.904091 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:09 crc kubenswrapper[4990]: I1003 11:23:09.982886 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.003961 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.006387 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hch9z\" (UniqueName: \"kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.006453 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.008875 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 
11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.008924 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.009654 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.014834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.022721 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.025045 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.027865 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hch9z\" (UniqueName: \"kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.028462 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.028731 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.030895 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data\") pod \"nova-api-0\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.035190 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.051904 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.111262 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.111376 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.111599 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.112168 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvllx\" (UniqueName: \"kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.112252 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc 
kubenswrapper[4990]: I1003 11:23:10.213713 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.213808 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvllx\" (UniqueName: \"kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.213855 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.213932 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.213979 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.214345 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.219294 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.223211 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.232669 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.234086 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvllx\" (UniqueName: \"kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx\") pod \"nova-metadata-0\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.451528 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.509678 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.665363 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerStarted","Data":"d23415fa78599274c70232234deac112f00419f65847ed053546902e4f933a07"} Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.667909 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f631f859-cbf9-48c9-9555-147dcce70e07","Type":"ContainerStarted","Data":"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13"} Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.694899 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.694879573 podStartE2EDuration="2.694879573s" podCreationTimestamp="2025-10-03 11:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:23:10.685967153 +0000 UTC m=+5972.482599020" watchObservedRunningTime="2025-10-03 11:23:10.694879573 +0000 UTC m=+5972.491511420" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.884478 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b154ab54-3f06-490f-93b8-658ec26ed0e8" path="/var/lib/kubelet/pods/b154ab54-3f06-490f-93b8-658ec26ed0e8/volumes" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.885212 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8c5a21-89d3-442f-8ec5-d780d2fd22c4" path="/var/lib/kubelet/pods/cd8c5a21-89d3-442f-8ec5-d780d2fd22c4/volumes" Oct 03 11:23:10 crc kubenswrapper[4990]: I1003 11:23:10.953230 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 03 11:23:10 crc kubenswrapper[4990]: W1003 11:23:10.954903 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f406cae_9e16_4ebf_8a12_4585b177eb9d.slice/crio-49bde3b6aae47fcaca00ee2b2cf9cae9f828ed5ca5256cbb285864e22f801077 WatchSource:0}: Error finding container 49bde3b6aae47fcaca00ee2b2cf9cae9f828ed5ca5256cbb285864e22f801077: Status 404 returned error can't find the container with id 49bde3b6aae47fcaca00ee2b2cf9cae9f828ed5ca5256cbb285864e22f801077 Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.681422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerStarted","Data":"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68"} Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.681483 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerStarted","Data":"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a"} Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.684403 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerStarted","Data":"7a08f708df8c8f2b7ed6869a02da58608e553958961a8fa9b16e21ee7ec83184"} Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.684445 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerStarted","Data":"a3f13f83759e227f9908ec6a6abbb9d6dc52019f21189f46ced9e01e967d412f"} Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.684466 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerStarted","Data":"49bde3b6aae47fcaca00ee2b2cf9cae9f828ed5ca5256cbb285864e22f801077"} Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.700580 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.700558644 podStartE2EDuration="2.700558644s" podCreationTimestamp="2025-10-03 11:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:23:11.6984751 +0000 UTC m=+5973.495106977" watchObservedRunningTime="2025-10-03 11:23:11.700558644 +0000 UTC m=+5973.497190511" Oct 03 11:23:11 crc kubenswrapper[4990]: I1003 11:23:11.737299 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.737280941 podStartE2EDuration="2.737280941s" podCreationTimestamp="2025-10-03 11:23:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:23:11.723546607 +0000 UTC m=+5973.520178484" watchObservedRunningTime="2025-10-03 11:23:11.737280941 +0000 UTC m=+5973.533912798" Oct 03 11:23:13 crc kubenswrapper[4990]: I1003 11:23:13.056894 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s6sbq"] Oct 03 11:23:13 crc kubenswrapper[4990]: I1003 11:23:13.065187 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s6sbq"] Oct 03 11:23:14 crc kubenswrapper[4990]: I1003 11:23:14.063990 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 11:23:14 crc kubenswrapper[4990]: I1003 11:23:14.890747 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17e9d0c-0fd4-4836-bf80-8c5b24d601bf" path="/var/lib/kubelet/pods/b17e9d0c-0fd4-4836-bf80-8c5b24d601bf/volumes" Oct 
03 11:23:15 crc kubenswrapper[4990]: I1003 11:23:15.452570 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 11:23:15 crc kubenswrapper[4990]: I1003 11:23:15.453568 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 11:23:15 crc kubenswrapper[4990]: I1003 11:23:15.873400 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:23:15 crc kubenswrapper[4990]: E1003 11:23:15.873914 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:23:19 crc kubenswrapper[4990]: I1003 11:23:19.063985 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 11:23:19 crc kubenswrapper[4990]: I1003 11:23:19.096643 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 11:23:19 crc kubenswrapper[4990]: I1003 11:23:19.827422 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 11:23:20 crc kubenswrapper[4990]: I1003 11:23:20.052792 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:23:20 crc kubenswrapper[4990]: I1003 11:23:20.053143 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:23:20 crc kubenswrapper[4990]: I1003 11:23:20.451891 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Oct 03 11:23:20 crc kubenswrapper[4990]: I1003 11:23:20.451954 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 11:23:21 crc kubenswrapper[4990]: I1003 11:23:21.135843 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.94:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:21 crc kubenswrapper[4990]: I1003 11:23:21.135931 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.94:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:21 crc kubenswrapper[4990]: I1003 11:23:21.470770 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:21 crc kubenswrapper[4990]: I1003 11:23:21.471063 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:23 crc kubenswrapper[4990]: I1003 11:23:23.044172 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d7bb-account-create-8slwx"] Oct 03 11:23:23 crc kubenswrapper[4990]: I1003 11:23:23.056249 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-d7bb-account-create-8slwx"] Oct 03 11:23:24 crc kubenswrapper[4990]: I1003 11:23:24.891096 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c41a7f-04c6-4f86-9502-b1e881ea0fb2" path="/var/lib/kubelet/pods/71c41a7f-04c6-4f86-9502-b1e881ea0fb2/volumes" Oct 03 11:23:28 crc kubenswrapper[4990]: I1003 11:23:28.885182 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:23:28 crc kubenswrapper[4990]: E1003 11:23:28.886744 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:23:29 crc kubenswrapper[4990]: I1003 11:23:29.062290 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xrc49"] Oct 03 11:23:29 crc kubenswrapper[4990]: I1003 11:23:29.071325 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xrc49"] Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.058645 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.059621 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.059964 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.068113 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 
11:23:30.457858 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.459761 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.465169 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.881942 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7407cb10-f324-46b2-b231-966df1553a10" path="/var/lib/kubelet/pods/7407cb10-f324-46b2-b231-966df1553a10/volumes" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.899800 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.903609 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 11:23:30 crc kubenswrapper[4990]: I1003 11:23:30.905592 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.086802 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.088323 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.110652 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.153591 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.153657 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.153682 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.153700 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vphq\" (UniqueName: \"kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.153727 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.256271 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.256328 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.256354 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.256371 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vphq\" (UniqueName: \"kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.256394 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.257716 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.258117 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.258196 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.258413 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.283730 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vphq\" (UniqueName: \"kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq\") pod \"dnsmasq-dns-db566b797-8xnmw\" (UID: 
\"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.409210 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.678071 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:23:31 crc kubenswrapper[4990]: I1003 11:23:31.911518 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db566b797-8xnmw" event={"ID":"5cc2b167-7f62-4c56-9654-58ea4cb892cf","Type":"ContainerStarted","Data":"b52af6258ac3524ddce29ed4dae979a24c845cc47cce06fc51e701c80c00c320"} Oct 03 11:23:32 crc kubenswrapper[4990]: I1003 11:23:32.919644 4990 generic.go:334] "Generic (PLEG): container finished" podID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerID="d9044de82add60e2e5438d5d5b644b628128a2f5bfe290e6b3832ce24696b46f" exitCode=0 Oct 03 11:23:32 crc kubenswrapper[4990]: I1003 11:23:32.920045 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db566b797-8xnmw" event={"ID":"5cc2b167-7f62-4c56-9654-58ea4cb892cf","Type":"ContainerDied","Data":"d9044de82add60e2e5438d5d5b644b628128a2f5bfe290e6b3832ce24696b46f"} Oct 03 11:23:33 crc kubenswrapper[4990]: I1003 11:23:33.932614 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db566b797-8xnmw" event={"ID":"5cc2b167-7f62-4c56-9654-58ea4cb892cf","Type":"ContainerStarted","Data":"1d6a59ed9a8e5356176c7f83abd61c4f574da44da3f0207502e1d0bad11ddb2a"} Oct 03 11:23:33 crc kubenswrapper[4990]: I1003 11:23:33.933149 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:33 crc kubenswrapper[4990]: I1003 11:23:33.954798 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db566b797-8xnmw" 
podStartSLOduration=2.95477234 podStartE2EDuration="2.95477234s" podCreationTimestamp="2025-10-03 11:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:23:33.952281536 +0000 UTC m=+5995.748913403" watchObservedRunningTime="2025-10-03 11:23:33.95477234 +0000 UTC m=+5995.751404217" Oct 03 11:23:34 crc kubenswrapper[4990]: I1003 11:23:34.224986 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:34 crc kubenswrapper[4990]: I1003 11:23:34.225559 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-log" containerID="cri-o://3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a" gracePeriod=30 Oct 03 11:23:34 crc kubenswrapper[4990]: I1003 11:23:34.225659 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-api" containerID="cri-o://b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68" gracePeriod=30 Oct 03 11:23:34 crc kubenswrapper[4990]: I1003 11:23:34.943803 4990 generic.go:334] "Generic (PLEG): container finished" podID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerID="3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a" exitCode=143 Oct 03 11:23:34 crc kubenswrapper[4990]: I1003 11:23:34.943871 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerDied","Data":"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a"} Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.925169 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.974927 4990 generic.go:334] "Generic (PLEG): container finished" podID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerID="b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68" exitCode=0 Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.974989 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerDied","Data":"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68"} Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.975028 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"158bbddd-d906-4bed-85c9-e4587c424ee6","Type":"ContainerDied","Data":"d23415fa78599274c70232234deac112f00419f65847ed053546902e4f933a07"} Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.975058 4990 scope.go:117] "RemoveContainer" containerID="b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68" Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.974991 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.984459 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle\") pod \"158bbddd-d906-4bed-85c9-e4587c424ee6\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.984932 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hch9z\" (UniqueName: \"kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z\") pod \"158bbddd-d906-4bed-85c9-e4587c424ee6\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.985069 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs\") pod \"158bbddd-d906-4bed-85c9-e4587c424ee6\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.985176 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data\") pod \"158bbddd-d906-4bed-85c9-e4587c424ee6\" (UID: \"158bbddd-d906-4bed-85c9-e4587c424ee6\") " Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.986030 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs" (OuterVolumeSpecName: "logs") pod "158bbddd-d906-4bed-85c9-e4587c424ee6" (UID: "158bbddd-d906-4bed-85c9-e4587c424ee6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:23:37 crc kubenswrapper[4990]: I1003 11:23:37.993764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z" (OuterVolumeSpecName: "kube-api-access-hch9z") pod "158bbddd-d906-4bed-85c9-e4587c424ee6" (UID: "158bbddd-d906-4bed-85c9-e4587c424ee6"). InnerVolumeSpecName "kube-api-access-hch9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.019708 4990 scope.go:117] "RemoveContainer" containerID="3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.029067 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data" (OuterVolumeSpecName: "config-data") pod "158bbddd-d906-4bed-85c9-e4587c424ee6" (UID: "158bbddd-d906-4bed-85c9-e4587c424ee6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.029351 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "158bbddd-d906-4bed-85c9-e4587c424ee6" (UID: "158bbddd-d906-4bed-85c9-e4587c424ee6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.088339 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/158bbddd-d906-4bed-85c9-e4587c424ee6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.106601 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.106652 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158bbddd-d906-4bed-85c9-e4587c424ee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.106678 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hch9z\" (UniqueName: \"kubernetes.io/projected/158bbddd-d906-4bed-85c9-e4587c424ee6-kube-api-access-hch9z\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.117116 4990 scope.go:117] "RemoveContainer" containerID="b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68" Oct 03 11:23:38 crc kubenswrapper[4990]: E1003 11:23:38.117749 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68\": container with ID starting with b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68 not found: ID does not exist" containerID="b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.117790 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68"} 
err="failed to get container status \"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68\": rpc error: code = NotFound desc = could not find container \"b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68\": container with ID starting with b55a41655ead4d66438dcbb9e173b605402eeeb96a678f5077ff7f282fbf9c68 not found: ID does not exist" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.117815 4990 scope.go:117] "RemoveContainer" containerID="3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a" Oct 03 11:23:38 crc kubenswrapper[4990]: E1003 11:23:38.118094 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a\": container with ID starting with 3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a not found: ID does not exist" containerID="3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.118193 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a"} err="failed to get container status \"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a\": rpc error: code = NotFound desc = could not find container \"3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a\": container with ID starting with 3972e088fd5d35471a8442614896dae1787a68d729296118721e03dfce3dda3a not found: ID does not exist" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.316793 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.326077 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.334044 4990 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:38 crc kubenswrapper[4990]: E1003 11:23:38.334449 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-log" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.334466 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-log" Oct 03 11:23:38 crc kubenswrapper[4990]: E1003 11:23:38.334484 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-api" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.334492 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-api" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.334702 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-api" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.334724 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" containerName="nova-api-log" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.336311 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.339962 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.340126 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.340265 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.342473 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.412686 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.412829 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7666\" (UniqueName: \"kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.412867 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.412911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.412963 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.413002 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.514825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.514958 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7666\" (UniqueName: \"kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.514986 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 
11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.515074 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.515176 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.515264 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.515461 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.520854 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.521652 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data\") pod \"nova-api-0\" (UID: 
\"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.523049 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.523438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.541458 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7666\" (UniqueName: \"kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666\") pod \"nova-api-0\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.655132 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 11:23:38 crc kubenswrapper[4990]: I1003 11:23:38.896002 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158bbddd-d906-4bed-85c9-e4587c424ee6" path="/var/lib/kubelet/pods/158bbddd-d906-4bed-85c9-e4587c424ee6/volumes" Oct 03 11:23:39 crc kubenswrapper[4990]: I1003 11:23:39.156778 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 11:23:39 crc kubenswrapper[4990]: I1003 11:23:39.871791 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:23:39 crc kubenswrapper[4990]: E1003 11:23:39.872395 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:23:39 crc kubenswrapper[4990]: I1003 11:23:39.995885 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerStarted","Data":"438fae3c516281efd9b6614bc2d17cbaf00a08b935c2a1bfa0583fc3ac27cec2"} Oct 03 11:23:39 crc kubenswrapper[4990]: I1003 11:23:39.995939 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerStarted","Data":"39d718c35f7c6da257a68938ac8e2401fabb6923863e7a3d489d25de81723bab"} Oct 03 11:23:39 crc kubenswrapper[4990]: I1003 11:23:39.995953 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerStarted","Data":"a156fdb9ab7ac6452eeea03f185dc12b0b980607e998108f470263da0ccd806f"} Oct 03 11:23:40 crc kubenswrapper[4990]: I1003 11:23:40.016899 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.016884745 podStartE2EDuration="2.016884745s" podCreationTimestamp="2025-10-03 11:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:23:40.013046016 +0000 UTC m=+6001.809677873" watchObservedRunningTime="2025-10-03 11:23:40.016884745 +0000 UTC m=+6001.813516602" Oct 03 11:23:41 crc kubenswrapper[4990]: I1003 11:23:41.410694 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:23:41 crc kubenswrapper[4990]: I1003 11:23:41.491558 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:23:41 crc kubenswrapper[4990]: I1003 11:23:41.491816 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="dnsmasq-dns" containerID="cri-o://2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d" gracePeriod=10 Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.007615 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.020382 4990 generic.go:334] "Generic (PLEG): container finished" podID="711df095-f8d6-419b-8586-e738ca12b49c" containerID="2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d" exitCode=0 Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.020423 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" event={"ID":"711df095-f8d6-419b-8586-e738ca12b49c","Type":"ContainerDied","Data":"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d"} Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.020457 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" event={"ID":"711df095-f8d6-419b-8586-e738ca12b49c","Type":"ContainerDied","Data":"9e6f2c18980df4a9f015fe879ffecddb31b2bd81bb172318eb0be2278f021d68"} Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.020468 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64678d8c55-rnd9s" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.020477 4990 scope.go:117] "RemoveContainer" containerID="2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.076887 4990 scope.go:117] "RemoveContainer" containerID="c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.092238 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb\") pod \"711df095-f8d6-419b-8586-e738ca12b49c\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.092349 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb\") pod \"711df095-f8d6-419b-8586-e738ca12b49c\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.092377 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config\") pod \"711df095-f8d6-419b-8586-e738ca12b49c\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.092445 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wc7d\" (UniqueName: \"kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d\") pod \"711df095-f8d6-419b-8586-e738ca12b49c\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.092503 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc\") pod \"711df095-f8d6-419b-8586-e738ca12b49c\" (UID: \"711df095-f8d6-419b-8586-e738ca12b49c\") " Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.099345 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d" (OuterVolumeSpecName: "kube-api-access-6wc7d") pod "711df095-f8d6-419b-8586-e738ca12b49c" (UID: "711df095-f8d6-419b-8586-e738ca12b49c"). InnerVolumeSpecName "kube-api-access-6wc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.116651 4990 scope.go:117] "RemoveContainer" containerID="2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d" Oct 03 11:23:42 crc kubenswrapper[4990]: E1003 11:23:42.127069 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d\": container with ID starting with 2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d not found: ID does not exist" containerID="2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.127116 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d"} err="failed to get container status \"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d\": rpc error: code = NotFound desc = could not find container \"2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d\": container with ID starting with 2cdf3223be2e7eadbfc8f70c4dd73eace25f172acd94d0f6e638a64bf339390d not found: ID does not exist" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.127138 4990 scope.go:117] "RemoveContainer" 
containerID="c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223" Oct 03 11:23:42 crc kubenswrapper[4990]: E1003 11:23:42.138803 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223\": container with ID starting with c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223 not found: ID does not exist" containerID="c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.138841 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223"} err="failed to get container status \"c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223\": rpc error: code = NotFound desc = could not find container \"c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223\": container with ID starting with c03285a71ecfee2a7aed60485e8b81c0577a2a7a4bab0e66992e7cfcf4853223 not found: ID does not exist" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.166269 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "711df095-f8d6-419b-8586-e738ca12b49c" (UID: "711df095-f8d6-419b-8586-e738ca12b49c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.171993 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "711df095-f8d6-419b-8586-e738ca12b49c" (UID: "711df095-f8d6-419b-8586-e738ca12b49c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.173225 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "711df095-f8d6-419b-8586-e738ca12b49c" (UID: "711df095-f8d6-419b-8586-e738ca12b49c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.187811 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config" (OuterVolumeSpecName: "config") pod "711df095-f8d6-419b-8586-e738ca12b49c" (UID: "711df095-f8d6-419b-8586-e738ca12b49c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.194687 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.194724 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.194738 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wc7d\" (UniqueName: \"kubernetes.io/projected/711df095-f8d6-419b-8586-e738ca12b49c-kube-api-access-6wc7d\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.194750 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 
11:23:42.194763 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/711df095-f8d6-419b-8586-e738ca12b49c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.357826 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.365741 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64678d8c55-rnd9s"] Oct 03 11:23:42 crc kubenswrapper[4990]: I1003 11:23:42.881929 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711df095-f8d6-419b-8586-e738ca12b49c" path="/var/lib/kubelet/pods/711df095-f8d6-419b-8586-e738ca12b49c/volumes" Oct 03 11:23:43 crc kubenswrapper[4990]: I1003 11:23:43.036364 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bbj4x"] Oct 03 11:23:43 crc kubenswrapper[4990]: I1003 11:23:43.045648 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bbj4x"] Oct 03 11:23:44 crc kubenswrapper[4990]: I1003 11:23:44.886303 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338090dc-62dd-40a5-b506-df026387f291" path="/var/lib/kubelet/pods/338090dc-62dd-40a5-b506-df026387f291/volumes" Oct 03 11:23:48 crc kubenswrapper[4990]: I1003 11:23:48.656619 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:23:48 crc kubenswrapper[4990]: I1003 11:23:48.657209 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.362030 4990 scope.go:117] "RemoveContainer" containerID="c0676911ca071fa924ef341ee2fa511f140639eeef1c840153ab8e60d4397635" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.405825 4990 scope.go:117] "RemoveContainer" 
containerID="471e0fd6b86a5fcf70dce1249d3fd6e7fb9b0ce35f8fed1b07bc14cd18ddb277" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.446351 4990 scope.go:117] "RemoveContainer" containerID="1867575ee49c1c0ef0338fa9444c73c2615236cdc96a2566c35141a06b7d555c" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.484916 4990 scope.go:117] "RemoveContainer" containerID="38cf246f6f532d86e2687493edd62607f2c2cebf46f90fd2071a02e221d03976" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.516492 4990 scope.go:117] "RemoveContainer" containerID="259cc44370d05973d8a599db38a9e1d374badf2fdce3be1dc295cc605628efab" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.553601 4990 scope.go:117] "RemoveContainer" containerID="deb3141b384b64896688b9c16243935c80d59345a45c620c634d1e0fa2af6064" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.671715 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.97:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:49 crc kubenswrapper[4990]: I1003 11:23:49.671719 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.97:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 11:23:54 crc kubenswrapper[4990]: I1003 11:23:54.872444 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:23:54 crc kubenswrapper[4990]: E1003 11:23:54.873233 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.662767 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.663586 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.663771 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.663797 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.672904 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 11:23:58 crc kubenswrapper[4990]: I1003 11:23:58.682475 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 11:24:07 crc kubenswrapper[4990]: I1003 11:24:07.871617 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16" Oct 03 11:24:08 crc kubenswrapper[4990]: I1003 11:24:08.302952 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3"} Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.162172 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vlqm5"] Oct 03 11:24:20 crc kubenswrapper[4990]: E1003 11:24:20.163294 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="dnsmasq-dns" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.163311 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="dnsmasq-dns" Oct 03 11:24:20 crc kubenswrapper[4990]: E1003 11:24:20.163360 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="init" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.163368 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="init" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.163630 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="711df095-f8d6-419b-8586-e738ca12b49c" containerName="dnsmasq-dns" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.164468 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.166895 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.166908 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.167073 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-wkhz2" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.173273 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vlqm5"] Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.195467 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4brg4"] Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.198819 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.221584 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4brg4"] Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.280955 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-combined-ca-bundle\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281033 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-log-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281057 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wg95\" (UniqueName: \"kubernetes.io/projected/dfc0151d-547f-4110-a1ea-707ba0e18796-kube-api-access-2wg95\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281078 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-lib\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281098 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-ovn-controller-tls-certs\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281115 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-log\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281136 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-run\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281151 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281187 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfc0151d-547f-4110-a1ea-707ba0e18796-scripts\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281208 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281232 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp5h\" (UniqueName: \"kubernetes.io/projected/4692a72f-d9d5-40ea-bd63-e8c939b254e9-kube-api-access-nkp5h\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281354 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4692a72f-d9d5-40ea-bd63-e8c939b254e9-scripts\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.281456 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-etc-ovs\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383263 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4692a72f-d9d5-40ea-bd63-e8c939b254e9-scripts\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383333 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-etc-ovs\") pod \"ovn-controller-ovs-4brg4\" 
(UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383409 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-combined-ca-bundle\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383470 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-log-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wg95\" (UniqueName: \"kubernetes.io/projected/dfc0151d-547f-4110-a1ea-707ba0e18796-kube-api-access-2wg95\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383541 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-lib\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383570 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-ovn-controller-tls-certs\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 
11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383590 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-log\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383614 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-run\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383635 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383676 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfc0151d-547f-4110-a1ea-707ba0e18796-scripts\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383736 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp5h\" 
(UniqueName: \"kubernetes.io/projected/4692a72f-d9d5-40ea-bd63-e8c939b254e9-kube-api-access-nkp5h\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383796 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-etc-ovs\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-lib\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-log-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-run\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383820 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run-ovn\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " 
pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383881 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4692a72f-d9d5-40ea-bd63-e8c939b254e9-var-run\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.383975 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dfc0151d-547f-4110-a1ea-707ba0e18796-var-log\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.387350 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dfc0151d-547f-4110-a1ea-707ba0e18796-scripts\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.391250 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4692a72f-d9d5-40ea-bd63-e8c939b254e9-scripts\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.394265 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-ovn-controller-tls-certs\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.394280 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4692a72f-d9d5-40ea-bd63-e8c939b254e9-combined-ca-bundle\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.401208 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp5h\" (UniqueName: \"kubernetes.io/projected/4692a72f-d9d5-40ea-bd63-e8c939b254e9-kube-api-access-nkp5h\") pod \"ovn-controller-vlqm5\" (UID: \"4692a72f-d9d5-40ea-bd63-e8c939b254e9\") " pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.405789 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wg95\" (UniqueName: \"kubernetes.io/projected/dfc0151d-547f-4110-a1ea-707ba0e18796-kube-api-access-2wg95\") pod \"ovn-controller-ovs-4brg4\" (UID: \"dfc0151d-547f-4110-a1ea-707ba0e18796\") " pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.493342 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.518780 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:20 crc kubenswrapper[4990]: I1003 11:24:20.985111 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vlqm5"] Oct 03 11:24:21 crc kubenswrapper[4990]: W1003 11:24:21.137167 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4692a72f_d9d5_40ea_bd63_e8c939b254e9.slice/crio-5dbad661808101e2363fbea195b4b3d5dab6385a3da62cc5f4f7764179b8c861 WatchSource:0}: Error finding container 5dbad661808101e2363fbea195b4b3d5dab6385a3da62cc5f4f7764179b8c861: Status 404 returned error can't find the container with id 5dbad661808101e2363fbea195b4b3d5dab6385a3da62cc5f4f7764179b8c861 Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.327312 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4brg4"] Oct 03 11:24:21 crc kubenswrapper[4990]: W1003 11:24:21.337434 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfc0151d_547f_4110_a1ea_707ba0e18796.slice/crio-10b3105c35b1c220c002513906b63df6dd2bd7bca5e035ef1b995c7ca7cd8ac8 WatchSource:0}: Error finding container 10b3105c35b1c220c002513906b63df6dd2bd7bca5e035ef1b995c7ca7cd8ac8: Status 404 returned error can't find the container with id 10b3105c35b1c220c002513906b63df6dd2bd7bca5e035ef1b995c7ca7cd8ac8 Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.515477 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4brg4" event={"ID":"dfc0151d-547f-4110-a1ea-707ba0e18796","Type":"ContainerStarted","Data":"10b3105c35b1c220c002513906b63df6dd2bd7bca5e035ef1b995c7ca7cd8ac8"} Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.516888 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vlqm5" 
event={"ID":"4692a72f-d9d5-40ea-bd63-e8c939b254e9","Type":"ContainerStarted","Data":"d00e860b0fefa78e0f67e5b2c22ff406a78d4260c98eb79ac453d9a739399672"} Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.516912 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vlqm5" event={"ID":"4692a72f-d9d5-40ea-bd63-e8c939b254e9","Type":"ContainerStarted","Data":"5dbad661808101e2363fbea195b4b3d5dab6385a3da62cc5f4f7764179b8c861"} Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.517049 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.538346 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vlqm5" podStartSLOduration=1.538330213 podStartE2EDuration="1.538330213s" podCreationTimestamp="2025-10-03 11:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:24:21.532455531 +0000 UTC m=+6043.329087428" watchObservedRunningTime="2025-10-03 11:24:21.538330213 +0000 UTC m=+6043.334962070" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.720031 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-964vn"] Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.733040 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.745291 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.746799 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-964vn"] Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822395 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3970bc67-319d-4676-b0f7-d0323d315f77-config\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822484 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822568 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovs-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822620 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovn-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " 
pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822733 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2wc9\" (UniqueName: \"kubernetes.io/projected/3970bc67-319d-4676-b0f7-d0323d315f77-kube-api-access-d2wc9\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.822784 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-combined-ca-bundle\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.923885 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924184 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovs-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924229 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovn-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") 
" pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924337 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2wc9\" (UniqueName: \"kubernetes.io/projected/3970bc67-319d-4676-b0f7-d0323d315f77-kube-api-access-d2wc9\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924369 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-combined-ca-bundle\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924395 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3970bc67-319d-4676-b0f7-d0323d315f77-config\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924537 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovs-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.924563 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3970bc67-319d-4676-b0f7-d0323d315f77-ovn-rundir\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc 
kubenswrapper[4990]: I1003 11:24:21.925100 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3970bc67-319d-4676-b0f7-d0323d315f77-config\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.929485 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.929736 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970bc67-319d-4676-b0f7-d0323d315f77-combined-ca-bundle\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:21 crc kubenswrapper[4990]: I1003 11:24:21.944148 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2wc9\" (UniqueName: \"kubernetes.io/projected/3970bc67-319d-4676-b0f7-d0323d315f77-kube-api-access-d2wc9\") pod \"ovn-controller-metrics-964vn\" (UID: \"3970bc67-319d-4676-b0f7-d0323d315f77\") " pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.058405 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-v68xc"] Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.059943 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.074107 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-v68xc"] Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.101237 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-964vn" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.128007 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdbk\" (UniqueName: \"kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk\") pod \"octavia-db-create-v68xc\" (UID: \"0059b5d8-652d-4a65-a5ba-cacea85aac53\") " pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.232683 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdbk\" (UniqueName: \"kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk\") pod \"octavia-db-create-v68xc\" (UID: \"0059b5d8-652d-4a65-a5ba-cacea85aac53\") " pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.257479 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdbk\" (UniqueName: \"kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk\") pod \"octavia-db-create-v68xc\" (UID: \"0059b5d8-652d-4a65-a5ba-cacea85aac53\") " pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.385849 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.557120 4990 generic.go:334] "Generic (PLEG): container finished" podID="dfc0151d-547f-4110-a1ea-707ba0e18796" containerID="274ebc53bebc1cdf895ead1c5b4631d4f172f0ec844671b50d90a2e788ad68d5" exitCode=0 Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.557198 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4brg4" event={"ID":"dfc0151d-547f-4110-a1ea-707ba0e18796","Type":"ContainerDied","Data":"274ebc53bebc1cdf895ead1c5b4631d4f172f0ec844671b50d90a2e788ad68d5"} Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.589787 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-964vn"] Oct 03 11:24:22 crc kubenswrapper[4990]: I1003 11:24:22.863991 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-v68xc"] Oct 03 11:24:22 crc kubenswrapper[4990]: W1003 11:24:22.873089 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0059b5d8_652d_4a65_a5ba_cacea85aac53.slice/crio-40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31 WatchSource:0}: Error finding container 40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31: Status 404 returned error can't find the container with id 40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31 Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.570704 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4brg4" event={"ID":"dfc0151d-547f-4110-a1ea-707ba0e18796","Type":"ContainerStarted","Data":"9542efe189af5d71002c29dbe6cdf249708434f7191cccc0026310f716ec9cff"} Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.571112 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:23 crc 
kubenswrapper[4990]: I1003 11:24:23.571129 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.571140 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4brg4" event={"ID":"dfc0151d-547f-4110-a1ea-707ba0e18796","Type":"ContainerStarted","Data":"52ab5c4711e70b1548c0a8d8ca6b8c250a77ddd086f0e3f4645eac259c985a0f"} Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.572878 4990 generic.go:334] "Generic (PLEG): container finished" podID="0059b5d8-652d-4a65-a5ba-cacea85aac53" containerID="288318739d759cf22a0b5b2936a57bcdc233bf00a39340a08b48ee34ea3618be" exitCode=0 Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.572917 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-v68xc" event={"ID":"0059b5d8-652d-4a65-a5ba-cacea85aac53","Type":"ContainerDied","Data":"288318739d759cf22a0b5b2936a57bcdc233bf00a39340a08b48ee34ea3618be"} Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.572944 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-v68xc" event={"ID":"0059b5d8-652d-4a65-a5ba-cacea85aac53","Type":"ContainerStarted","Data":"40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31"} Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.574801 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-964vn" event={"ID":"3970bc67-319d-4676-b0f7-d0323d315f77","Type":"ContainerStarted","Data":"ce269c27d17557e6eeb4221ac7829d21ac0b7ba2b520d6ef6be5d0fbaccdd4b2"} Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.574823 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-964vn" event={"ID":"3970bc67-319d-4676-b0f7-d0323d315f77","Type":"ContainerStarted","Data":"b637d87e6e7aac5016d2ff8715676c487b56e3dc020ce5a3def036a7db1e0fa1"} Oct 03 11:24:23 crc 
kubenswrapper[4990]: I1003 11:24:23.596799 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4brg4" podStartSLOduration=3.596782889 podStartE2EDuration="3.596782889s" podCreationTimestamp="2025-10-03 11:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:24:23.591988865 +0000 UTC m=+6045.388620722" watchObservedRunningTime="2025-10-03 11:24:23.596782889 +0000 UTC m=+6045.393414746" Oct 03 11:24:23 crc kubenswrapper[4990]: I1003 11:24:23.629528 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-964vn" podStartSLOduration=2.629491583 podStartE2EDuration="2.629491583s" podCreationTimestamp="2025-10-03 11:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:24:23.621990659 +0000 UTC m=+6045.418622516" watchObservedRunningTime="2025-10-03 11:24:23.629491583 +0000 UTC m=+6045.426123440" Oct 03 11:24:24 crc kubenswrapper[4990]: I1003 11:24:24.956563 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:24 crc kubenswrapper[4990]: I1003 11:24:24.994525 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdbk\" (UniqueName: \"kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk\") pod \"0059b5d8-652d-4a65-a5ba-cacea85aac53\" (UID: \"0059b5d8-652d-4a65-a5ba-cacea85aac53\") " Oct 03 11:24:25 crc kubenswrapper[4990]: I1003 11:24:25.001909 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk" (OuterVolumeSpecName: "kube-api-access-8tdbk") pod "0059b5d8-652d-4a65-a5ba-cacea85aac53" (UID: "0059b5d8-652d-4a65-a5ba-cacea85aac53"). InnerVolumeSpecName "kube-api-access-8tdbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:24:25 crc kubenswrapper[4990]: I1003 11:24:25.096919 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdbk\" (UniqueName: \"kubernetes.io/projected/0059b5d8-652d-4a65-a5ba-cacea85aac53-kube-api-access-8tdbk\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:25 crc kubenswrapper[4990]: I1003 11:24:25.597096 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-v68xc" event={"ID":"0059b5d8-652d-4a65-a5ba-cacea85aac53","Type":"ContainerDied","Data":"40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31"} Oct 03 11:24:25 crc kubenswrapper[4990]: I1003 11:24:25.597135 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-v68xc" Oct 03 11:24:25 crc kubenswrapper[4990]: I1003 11:24:25.597143 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40f8246344bc16c7475432ab1d9f1b20bf3b8e59a3c29ffffc9173e76137cb31" Oct 03 11:24:32 crc kubenswrapper[4990]: I1003 11:24:32.996470 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-cafd-account-create-g48vw"] Oct 03 11:24:32 crc kubenswrapper[4990]: E1003 11:24:32.997552 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0059b5d8-652d-4a65-a5ba-cacea85aac53" containerName="mariadb-database-create" Oct 03 11:24:32 crc kubenswrapper[4990]: I1003 11:24:32.997573 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0059b5d8-652d-4a65-a5ba-cacea85aac53" containerName="mariadb-database-create" Oct 03 11:24:32 crc kubenswrapper[4990]: I1003 11:24:32.997753 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0059b5d8-652d-4a65-a5ba-cacea85aac53" containerName="mariadb-database-create" Oct 03 11:24:32 crc kubenswrapper[4990]: I1003 11:24:32.998380 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.001992 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.018880 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cafd-account-create-g48vw"] Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.070505 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfkv\" (UniqueName: \"kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv\") pod \"octavia-cafd-account-create-g48vw\" (UID: \"9787f12f-7dca-40cb-a7fb-274c9ddd10a0\") " pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.171941 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfkv\" (UniqueName: \"kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv\") pod \"octavia-cafd-account-create-g48vw\" (UID: \"9787f12f-7dca-40cb-a7fb-274c9ddd10a0\") " pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.196199 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfkv\" (UniqueName: \"kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv\") pod \"octavia-cafd-account-create-g48vw\" (UID: \"9787f12f-7dca-40cb-a7fb-274c9ddd10a0\") " pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.353664 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:33 crc kubenswrapper[4990]: I1003 11:24:33.805914 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cafd-account-create-g48vw"] Oct 03 11:24:34 crc kubenswrapper[4990]: I1003 11:24:34.698388 4990 generic.go:334] "Generic (PLEG): container finished" podID="9787f12f-7dca-40cb-a7fb-274c9ddd10a0" containerID="3a969e21e0c5507e1e54697b710ed472a73639808141c9ee23ceb0b78bd6eaca" exitCode=0 Oct 03 11:24:34 crc kubenswrapper[4990]: I1003 11:24:34.698435 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cafd-account-create-g48vw" event={"ID":"9787f12f-7dca-40cb-a7fb-274c9ddd10a0","Type":"ContainerDied","Data":"3a969e21e0c5507e1e54697b710ed472a73639808141c9ee23ceb0b78bd6eaca"} Oct 03 11:24:34 crc kubenswrapper[4990]: I1003 11:24:34.698464 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cafd-account-create-g48vw" event={"ID":"9787f12f-7dca-40cb-a7fb-274c9ddd10a0","Type":"ContainerStarted","Data":"9a8e1a2d582c1dc2709afccf2db71419f1abf5223989b0faa35c12830e7b56e7"} Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.048285 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.131543 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfkv\" (UniqueName: \"kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv\") pod \"9787f12f-7dca-40cb-a7fb-274c9ddd10a0\" (UID: \"9787f12f-7dca-40cb-a7fb-274c9ddd10a0\") " Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.136228 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv" (OuterVolumeSpecName: "kube-api-access-pkfkv") pod "9787f12f-7dca-40cb-a7fb-274c9ddd10a0" (UID: "9787f12f-7dca-40cb-a7fb-274c9ddd10a0"). InnerVolumeSpecName "kube-api-access-pkfkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.233937 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfkv\" (UniqueName: \"kubernetes.io/projected/9787f12f-7dca-40cb-a7fb-274c9ddd10a0-kube-api-access-pkfkv\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.721126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cafd-account-create-g48vw" event={"ID":"9787f12f-7dca-40cb-a7fb-274c9ddd10a0","Type":"ContainerDied","Data":"9a8e1a2d582c1dc2709afccf2db71419f1abf5223989b0faa35c12830e7b56e7"} Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.721686 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8e1a2d582c1dc2709afccf2db71419f1abf5223989b0faa35c12830e7b56e7" Oct 03 11:24:36 crc kubenswrapper[4990]: I1003 11:24:36.721203 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-cafd-account-create-g48vw" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.009657 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-zxhbz"] Oct 03 11:24:39 crc kubenswrapper[4990]: E1003 11:24:39.010550 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9787f12f-7dca-40cb-a7fb-274c9ddd10a0" containerName="mariadb-account-create" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.010569 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9787f12f-7dca-40cb-a7fb-274c9ddd10a0" containerName="mariadb-account-create" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.010824 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9787f12f-7dca-40cb-a7fb-274c9ddd10a0" containerName="mariadb-account-create" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.011682 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.018782 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-zxhbz"] Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.093907 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65mz\" (UniqueName: \"kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz\") pod \"octavia-persistence-db-create-zxhbz\" (UID: \"11514f04-f626-42e7-881b-32ccab00f950\") " pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.195579 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65mz\" (UniqueName: \"kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz\") pod \"octavia-persistence-db-create-zxhbz\" (UID: 
\"11514f04-f626-42e7-881b-32ccab00f950\") " pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.214015 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65mz\" (UniqueName: \"kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz\") pod \"octavia-persistence-db-create-zxhbz\" (UID: \"11514f04-f626-42e7-881b-32ccab00f950\") " pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.337049 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:39 crc kubenswrapper[4990]: I1003 11:24:39.790968 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-zxhbz"] Oct 03 11:24:39 crc kubenswrapper[4990]: W1003 11:24:39.792008 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11514f04_f626_42e7_881b_32ccab00f950.slice/crio-a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435 WatchSource:0}: Error finding container a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435: Status 404 returned error can't find the container with id a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435 Oct 03 11:24:40 crc kubenswrapper[4990]: I1003 11:24:40.761300 4990 generic.go:334] "Generic (PLEG): container finished" podID="11514f04-f626-42e7-881b-32ccab00f950" containerID="f4915968a7edf0a003c53824846a154d587cef54d89409e43272b381c5221ad0" exitCode=0 Oct 03 11:24:40 crc kubenswrapper[4990]: I1003 11:24:40.761487 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-zxhbz" event={"ID":"11514f04-f626-42e7-881b-32ccab00f950","Type":"ContainerDied","Data":"f4915968a7edf0a003c53824846a154d587cef54d89409e43272b381c5221ad0"} Oct 03 11:24:40 
crc kubenswrapper[4990]: I1003 11:24:40.761690 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-zxhbz" event={"ID":"11514f04-f626-42e7-881b-32ccab00f950","Type":"ContainerStarted","Data":"a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435"} Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.162742 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.254084 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65mz\" (UniqueName: \"kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz\") pod \"11514f04-f626-42e7-881b-32ccab00f950\" (UID: \"11514f04-f626-42e7-881b-32ccab00f950\") " Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.259135 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz" (OuterVolumeSpecName: "kube-api-access-c65mz") pod "11514f04-f626-42e7-881b-32ccab00f950" (UID: "11514f04-f626-42e7-881b-32ccab00f950"). InnerVolumeSpecName "kube-api-access-c65mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.357599 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65mz\" (UniqueName: \"kubernetes.io/projected/11514f04-f626-42e7-881b-32ccab00f950-kube-api-access-c65mz\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.784821 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-zxhbz" event={"ID":"11514f04-f626-42e7-881b-32ccab00f950","Type":"ContainerDied","Data":"a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435"} Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.785090 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0233ce9ca9eeb31829f73b77f9d88af78c3c195d8f417bd59b38b0051f7b435" Oct 03 11:24:42 crc kubenswrapper[4990]: I1003 11:24:42.784924 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-zxhbz" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.952052 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-2590-account-create-nzxrf"] Oct 03 11:24:49 crc kubenswrapper[4990]: E1003 11:24:49.954229 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11514f04-f626-42e7-881b-32ccab00f950" containerName="mariadb-database-create" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.954325 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="11514f04-f626-42e7-881b-32ccab00f950" containerName="mariadb-database-create" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.954608 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="11514f04-f626-42e7-881b-32ccab00f950" containerName="mariadb-database-create" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.966431 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.968853 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 03 11:24:49 crc kubenswrapper[4990]: I1003 11:24:49.977428 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2590-account-create-nzxrf"] Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.021209 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9l9\" (UniqueName: \"kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9\") pod \"octavia-2590-account-create-nzxrf\" (UID: \"c979979a-92fc-4157-b350-cb02173dd164\") " pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.135336 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm9l9\" (UniqueName: \"kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9\") pod \"octavia-2590-account-create-nzxrf\" (UID: \"c979979a-92fc-4157-b350-cb02173dd164\") " pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.174223 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm9l9\" (UniqueName: \"kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9\") pod \"octavia-2590-account-create-nzxrf\" (UID: \"c979979a-92fc-4157-b350-cb02173dd164\") " pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.286020 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.708084 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2590-account-create-nzxrf"] Oct 03 11:24:50 crc kubenswrapper[4990]: I1003 11:24:50.858313 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2590-account-create-nzxrf" event={"ID":"c979979a-92fc-4157-b350-cb02173dd164","Type":"ContainerStarted","Data":"e5ce7b89b4ad07c7b8ee590aa658969867bbd294237a789fcd930df526da02f9"} Oct 03 11:24:51 crc kubenswrapper[4990]: I1003 11:24:51.868869 4990 generic.go:334] "Generic (PLEG): container finished" podID="c979979a-92fc-4157-b350-cb02173dd164" containerID="7dda0d38a98f5ae29c3a9269f330daebd70b7007e274f6a67be6f42da1216544" exitCode=0 Oct 03 11:24:51 crc kubenswrapper[4990]: I1003 11:24:51.868952 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2590-account-create-nzxrf" event={"ID":"c979979a-92fc-4157-b350-cb02173dd164","Type":"ContainerDied","Data":"7dda0d38a98f5ae29c3a9269f330daebd70b7007e274f6a67be6f42da1216544"} Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.263266 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.297180 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm9l9\" (UniqueName: \"kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9\") pod \"c979979a-92fc-4157-b350-cb02173dd164\" (UID: \"c979979a-92fc-4157-b350-cb02173dd164\") " Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.304788 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9" (OuterVolumeSpecName: "kube-api-access-qm9l9") pod "c979979a-92fc-4157-b350-cb02173dd164" (UID: "c979979a-92fc-4157-b350-cb02173dd164"). InnerVolumeSpecName "kube-api-access-qm9l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.399887 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm9l9\" (UniqueName: \"kubernetes.io/projected/c979979a-92fc-4157-b350-cb02173dd164-kube-api-access-qm9l9\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.892494 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2590-account-create-nzxrf" event={"ID":"c979979a-92fc-4157-b350-cb02173dd164","Type":"ContainerDied","Data":"e5ce7b89b4ad07c7b8ee590aa658969867bbd294237a789fcd930df526da02f9"} Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.892561 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5ce7b89b4ad07c7b8ee590aa658969867bbd294237a789fcd930df526da02f9" Oct 03 11:24:53 crc kubenswrapper[4990]: I1003 11:24:53.892630 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2590-account-create-nzxrf" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.551266 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vlqm5" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.580319 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.589855 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4brg4" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.750130 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:24:55 crc kubenswrapper[4990]: E1003 11:24:55.750691 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c979979a-92fc-4157-b350-cb02173dd164" containerName="mariadb-account-create" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.750717 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c979979a-92fc-4157-b350-cb02173dd164" containerName="mariadb-account-create" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.753929 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c979979a-92fc-4157-b350-cb02173dd164" containerName="mariadb-account-create" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.755775 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.761503 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.761776 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.761885 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-j2j6k" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.762562 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.768915 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.788412 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vlqm5-config-48fln"] Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.789731 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.795298 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.815594 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vlqm5-config-48fln"] Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855066 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855163 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855190 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855213 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " 
pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855257 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855365 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855384 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855406 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle\") pod 
\"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855427 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855456 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.855476 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7gd\" (UniqueName: \"kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957713 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957754 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957780 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957810 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7gd\" (UniqueName: \"kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957850 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957922 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957950 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.957974 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.958017 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.958083 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.958136 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.958158 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.958757 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.959055 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.959252 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.959371 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.960136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn\") pod 
\"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.960797 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.961505 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.963238 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.963298 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.963818 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " 
pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.976288 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data\") pod \"octavia-api-54986ddb77-tpbdc\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:55 crc kubenswrapper[4990]: I1003 11:24:55.977793 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7gd\" (UniqueName: \"kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd\") pod \"ovn-controller-vlqm5-config-48fln\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.082382 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.129709 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.681088 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.688109 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.779699 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vlqm5-config-48fln"] Oct 03 11:24:56 crc kubenswrapper[4990]: W1003 11:24:56.789066 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7198cc91_a15e_4ffb_bfb1_bf1e92f606b3.slice/crio-118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d WatchSource:0}: Error finding container 118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d: Status 404 returned error can't find the container with id 118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.921210 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerStarted","Data":"b389b19422536bf62f528f18f4dbaf233612e02ba6f85bf399f440bc97686ccc"} Oct 03 11:24:56 crc kubenswrapper[4990]: I1003 11:24:56.922708 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vlqm5-config-48fln" event={"ID":"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3","Type":"ContainerStarted","Data":"118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d"} Oct 03 11:24:57 crc kubenswrapper[4990]: I1003 11:24:57.935490 4990 generic.go:334] "Generic (PLEG): container finished" podID="7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" containerID="8a6412045df80c6344d107bc80b500a7a5206941658e3d2d8545c0f413a8234e" 
exitCode=0 Oct 03 11:24:57 crc kubenswrapper[4990]: I1003 11:24:57.935762 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vlqm5-config-48fln" event={"ID":"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3","Type":"ContainerDied","Data":"8a6412045df80c6344d107bc80b500a7a5206941658e3d2d8545c0f413a8234e"} Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.413928 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546080 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546489 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b7gd\" (UniqueName: \"kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546535 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546622 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 
11:24:59.546701 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546733 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn\") pod \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\" (UID: \"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3\") " Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546805 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546860 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run" (OuterVolumeSpecName: "var-run") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546895 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.546986 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.547557 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts" (OuterVolumeSpecName: "scripts") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.548211 4990 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.548239 4990 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.548253 4990 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.548264 4990 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc 
kubenswrapper[4990]: I1003 11:24:59.548276 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.558802 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd" (OuterVolumeSpecName: "kube-api-access-4b7gd") pod "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" (UID: "7198cc91-a15e-4ffb-bfb1-bf1e92f606b3"). InnerVolumeSpecName "kube-api-access-4b7gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.649948 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b7gd\" (UniqueName: \"kubernetes.io/projected/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3-kube-api-access-4b7gd\") on node \"crc\" DevicePath \"\"" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.974607 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vlqm5-config-48fln" event={"ID":"7198cc91-a15e-4ffb-bfb1-bf1e92f606b3","Type":"ContainerDied","Data":"118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d"} Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.974648 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="118bee4baded7f69d3c392745307f13b85065d95f857d265a95a5f0fb818e52d" Oct 03 11:24:59 crc kubenswrapper[4990]: I1003 11:24:59.974679 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vlqm5-config-48fln" Oct 03 11:25:00 crc kubenswrapper[4990]: I1003 11:25:00.516643 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vlqm5-config-48fln"] Oct 03 11:25:00 crc kubenswrapper[4990]: I1003 11:25:00.541271 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vlqm5-config-48fln"] Oct 03 11:25:00 crc kubenswrapper[4990]: I1003 11:25:00.892810 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" path="/var/lib/kubelet/pods/7198cc91-a15e-4ffb-bfb1-bf1e92f606b3/volumes" Oct 03 11:25:07 crc kubenswrapper[4990]: I1003 11:25:07.047623 4990 generic.go:334] "Generic (PLEG): container finished" podID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerID="7bbcc92b566fed4ee3e664992fc91bbc47ba43822e9fbecd6aec295090bdf855" exitCode=0 Oct 03 11:25:07 crc kubenswrapper[4990]: I1003 11:25:07.047801 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerDied","Data":"7bbcc92b566fed4ee3e664992fc91bbc47ba43822e9fbecd6aec295090bdf855"} Oct 03 11:25:08 crc kubenswrapper[4990]: I1003 11:25:08.060021 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerStarted","Data":"bdd5bc4f1b3b775f4212dead38c6f52f7f63c74269c2e8012c7d26fa0e0ff163"} Oct 03 11:25:08 crc kubenswrapper[4990]: I1003 11:25:08.060339 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:08 crc kubenswrapper[4990]: I1003 11:25:08.060583 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:08 crc kubenswrapper[4990]: I1003 11:25:08.060599 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerStarted","Data":"3e8faa3a9ef813728ff839d427a63e5d6a31a83faaa35a0e5ad17a822907bfb9"} Oct 03 11:25:08 crc kubenswrapper[4990]: I1003 11:25:08.091967 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-54986ddb77-tpbdc" podStartSLOduration=3.645369999 podStartE2EDuration="13.091940502s" podCreationTimestamp="2025-10-03 11:24:55 +0000 UTC" firstStartedPulling="2025-10-03 11:24:56.687928319 +0000 UTC m=+6078.484560176" lastFinishedPulling="2025-10-03 11:25:06.134498792 +0000 UTC m=+6087.931130679" observedRunningTime="2025-10-03 11:25:08.084523651 +0000 UTC m=+6089.881155528" watchObservedRunningTime="2025-10-03 11:25:08.091940502 +0000 UTC m=+6089.888572369" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.273084 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:20 crc kubenswrapper[4990]: E1003 11:25:20.274140 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" containerName="ovn-config" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.274160 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" containerName="ovn-config" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.274390 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7198cc91-a15e-4ffb-bfb1-bf1e92f606b3" containerName="ovn-config" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.276054 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.291666 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.473298 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.473367 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwgq\" (UniqueName: \"kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.474180 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.576376 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.576594 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.576636 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwgq\" (UniqueName: \"kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.576949 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.577184 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.603594 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwgq\" (UniqueName: \"kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq\") pod \"certified-operators-k4rfw\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:20 crc kubenswrapper[4990]: I1003 11:25:20.615698 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:21 crc kubenswrapper[4990]: I1003 11:25:21.155644 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:21 crc kubenswrapper[4990]: I1003 11:25:21.191470 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerStarted","Data":"0bd2ea7b6844ea941e0c166e3a45f44469362b294bd447344b3ce8b5c5b7b941"} Oct 03 11:25:22 crc kubenswrapper[4990]: I1003 11:25:22.211603 4990 generic.go:334] "Generic (PLEG): container finished" podID="6fefd121-c58a-4df0-a600-b8072a265760" containerID="5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd" exitCode=0 Oct 03 11:25:22 crc kubenswrapper[4990]: I1003 11:25:22.212033 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerDied","Data":"5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd"} Oct 03 11:25:24 crc kubenswrapper[4990]: I1003 11:25:24.232152 4990 generic.go:334] "Generic (PLEG): container finished" podID="6fefd121-c58a-4df0-a600-b8072a265760" containerID="8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20" exitCode=0 Oct 03 11:25:24 crc kubenswrapper[4990]: I1003 11:25:24.232271 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerDied","Data":"8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20"} Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.249389 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" 
event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerStarted","Data":"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2"} Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.289704 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4rfw" podStartSLOduration=2.855065521 podStartE2EDuration="5.28967512s" podCreationTimestamp="2025-10-03 11:25:20 +0000 UTC" firstStartedPulling="2025-10-03 11:25:22.228601432 +0000 UTC m=+6104.025233279" lastFinishedPulling="2025-10-03 11:25:24.663210981 +0000 UTC m=+6106.459842878" observedRunningTime="2025-10-03 11:25:25.2764798 +0000 UTC m=+6107.073111687" watchObservedRunningTime="2025-10-03 11:25:25.28967512 +0000 UTC m=+6107.086307007" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.403036 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-hp5sv"] Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.404972 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.410077 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.410302 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.410472 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.430379 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-hp5sv"] Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.597361 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6d8de341-400b-42bb-b230-f7408b6e9075-hm-ports\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.597422 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6d8de341-400b-42bb-b230-f7408b6e9075-config-data-merged\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.597633 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-config-data\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.597820 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-scripts\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.699074 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-scripts\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.699229 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6d8de341-400b-42bb-b230-f7408b6e9075-hm-ports\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.699260 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6d8de341-400b-42bb-b230-f7408b6e9075-config-data-merged\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.699301 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-config-data\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.699957 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/6d8de341-400b-42bb-b230-f7408b6e9075-config-data-merged\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.700424 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/6d8de341-400b-42bb-b230-f7408b6e9075-hm-ports\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.704940 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-config-data\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.721176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8de341-400b-42bb-b230-f7408b6e9075-scripts\") pod \"octavia-rsyslog-hp5sv\" (UID: \"6d8de341-400b-42bb-b230-f7408b6e9075\") " pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:25 crc kubenswrapper[4990]: I1003 11:25:25.723680 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.265053 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.267556 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.272320 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.282538 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.322178 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-hp5sv"] Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.417986 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.418124 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.519797 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.519908 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.520354 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.528673 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config\") pod \"octavia-image-upload-678599687f-hqx9g\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:26 crc kubenswrapper[4990]: I1003 11:25:26.598969 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.062573 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:25:27 crc kubenswrapper[4990]: W1003 11:25:27.070670 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1105cd3_adde_4a69_8ea7_8834c9038f0a.slice/crio-9dc0e6cc8915acbcc2a4b726ed394e8155caeaef6a10534302c41f68fedd4791 WatchSource:0}: Error finding container 9dc0e6cc8915acbcc2a4b726ed394e8155caeaef6a10534302c41f68fedd4791: Status 404 returned error can't find the container with id 9dc0e6cc8915acbcc2a4b726ed394e8155caeaef6a10534302c41f68fedd4791 Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.278496 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hqx9g" event={"ID":"b1105cd3-adde-4a69-8ea7-8834c9038f0a","Type":"ContainerStarted","Data":"9dc0e6cc8915acbcc2a4b726ed394e8155caeaef6a10534302c41f68fedd4791"} Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.283132 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-hp5sv" event={"ID":"6d8de341-400b-42bb-b230-f7408b6e9075","Type":"ContainerStarted","Data":"ee0fdc0a167f8dd9dc6c28636229e2199a2fa224250108c9c7dd96fb9e2e2ca8"} Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.347450 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-786b7b89f7-lbh87"] Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.350091 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.356054 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.357153 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.397619 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-786b7b89f7-lbh87"] Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.442826 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-public-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.442878 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-config-data-merged\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.442901 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-config-data\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.442980 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-octavia-run\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.443054 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-combined-ca-bundle\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.443085 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-scripts\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.443130 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-ovndb-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.443154 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-internal-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546041 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-ovndb-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546095 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-internal-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546160 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-public-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546209 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-config-data\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546228 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-config-data-merged\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546277 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-octavia-run\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546340 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-combined-ca-bundle\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.546372 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-scripts\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.547809 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-octavia-run\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.547845 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/72d51b92-4d67-465b-8e49-41ba2069572f-config-data-merged\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.554012 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-internal-tls-certs\") pod 
\"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.554322 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-config-data\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.555389 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-scripts\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.556188 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-ovndb-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.556972 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-public-tls-certs\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.558136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d51b92-4d67-465b-8e49-41ba2069572f-combined-ca-bundle\") pod \"octavia-api-786b7b89f7-lbh87\" (UID: \"72d51b92-4d67-465b-8e49-41ba2069572f\") " 
pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:27 crc kubenswrapper[4990]: I1003 11:25:27.686779 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:29 crc kubenswrapper[4990]: I1003 11:25:29.088549 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-786b7b89f7-lbh87"] Oct 03 11:25:29 crc kubenswrapper[4990]: W1003 11:25:29.117082 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d51b92_4d67_465b_8e49_41ba2069572f.slice/crio-36788c19b917527b1c213e1af3a16cdb4c121d3c29003335f721de3b5dd4d0c1 WatchSource:0}: Error finding container 36788c19b917527b1c213e1af3a16cdb4c121d3c29003335f721de3b5dd4d0c1: Status 404 returned error can't find the container with id 36788c19b917527b1c213e1af3a16cdb4c121d3c29003335f721de3b5dd4d0c1 Oct 03 11:25:29 crc kubenswrapper[4990]: I1003 11:25:29.315569 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-786b7b89f7-lbh87" event={"ID":"72d51b92-4d67-465b-8e49-41ba2069572f","Type":"ContainerStarted","Data":"36788c19b917527b1c213e1af3a16cdb4c121d3c29003335f721de3b5dd4d0c1"} Oct 03 11:25:30 crc kubenswrapper[4990]: I1003 11:25:30.330275 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-hp5sv" event={"ID":"6d8de341-400b-42bb-b230-f7408b6e9075","Type":"ContainerStarted","Data":"75c6a7a1d8306ce5b1fd04518db00c07802ac09797c2e01e31bc210959e58add"} Oct 03 11:25:30 crc kubenswrapper[4990]: I1003 11:25:30.538569 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:30 crc kubenswrapper[4990]: I1003 11:25:30.616843 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:30 crc kubenswrapper[4990]: I1003 11:25:30.616927 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:30 crc kubenswrapper[4990]: I1003 11:25:30.641340 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:31 crc kubenswrapper[4990]: I1003 11:25:31.342325 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-786b7b89f7-lbh87" event={"ID":"72d51b92-4d67-465b-8e49-41ba2069572f","Type":"ContainerStarted","Data":"393606ba772275e073270d5ad0be541e411954680364a206727b1030d929843e"} Oct 03 11:25:31 crc kubenswrapper[4990]: I1003 11:25:31.675986 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k4rfw" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" probeResult="failure" output=< Oct 03 11:25:31 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 11:25:31 crc kubenswrapper[4990]: > Oct 03 11:25:32 crc kubenswrapper[4990]: I1003 11:25:32.355698 4990 generic.go:334] "Generic (PLEG): container finished" podID="72d51b92-4d67-465b-8e49-41ba2069572f" containerID="393606ba772275e073270d5ad0be541e411954680364a206727b1030d929843e" exitCode=0 Oct 03 11:25:32 crc kubenswrapper[4990]: I1003 11:25:32.355815 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-786b7b89f7-lbh87" event={"ID":"72d51b92-4d67-465b-8e49-41ba2069572f","Type":"ContainerDied","Data":"393606ba772275e073270d5ad0be541e411954680364a206727b1030d929843e"} Oct 03 11:25:32 crc kubenswrapper[4990]: I1003 11:25:32.363011 4990 generic.go:334] "Generic (PLEG): container finished" podID="6d8de341-400b-42bb-b230-f7408b6e9075" containerID="75c6a7a1d8306ce5b1fd04518db00c07802ac09797c2e01e31bc210959e58add" exitCode=0 Oct 03 11:25:32 crc kubenswrapper[4990]: I1003 11:25:32.363056 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-rsyslog-hp5sv" event={"ID":"6d8de341-400b-42bb-b230-f7408b6e9075","Type":"ContainerDied","Data":"75c6a7a1d8306ce5b1fd04518db00c07802ac09797c2e01e31bc210959e58add"} Oct 03 11:25:33 crc kubenswrapper[4990]: I1003 11:25:33.379609 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-786b7b89f7-lbh87" event={"ID":"72d51b92-4d67-465b-8e49-41ba2069572f","Type":"ContainerStarted","Data":"7a4c40efb2b9956432d37db3786c2bc696884b0ddfd9c190d9e194ee7187231f"} Oct 03 11:25:33 crc kubenswrapper[4990]: I1003 11:25:33.379973 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-786b7b89f7-lbh87" event={"ID":"72d51b92-4d67-465b-8e49-41ba2069572f","Type":"ContainerStarted","Data":"802c3d92b745c9a572573a4012bb2dba93a7a24e3e55603eeb13c62f23b8be11"} Oct 03 11:25:33 crc kubenswrapper[4990]: I1003 11:25:33.380367 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:33 crc kubenswrapper[4990]: I1003 11:25:33.412798 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-786b7b89f7-lbh87" podStartSLOduration=6.412781346 podStartE2EDuration="6.412781346s" podCreationTimestamp="2025-10-03 11:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:25:33.407783607 +0000 UTC m=+6115.204415474" watchObservedRunningTime="2025-10-03 11:25:33.412781346 +0000 UTC m=+6115.209413193" Oct 03 11:25:34 crc kubenswrapper[4990]: I1003 11:25:34.390093 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:34 crc kubenswrapper[4990]: I1003 11:25:34.994848 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-clsdg"] Oct 03 11:25:34 crc kubenswrapper[4990]: I1003 11:25:34.998472 4990 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.001075 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.009104 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-clsdg"] Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.131888 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.131937 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.132016 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.132030 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 
11:25:35.234170 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.234501 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.234595 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.234613 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.238710 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.244390 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.247073 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.254586 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle\") pod \"octavia-db-sync-clsdg\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:35 crc kubenswrapper[4990]: I1003 11:25:35.324800 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:38 crc kubenswrapper[4990]: I1003 11:25:38.551705 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-clsdg"] Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.443967 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-hp5sv" event={"ID":"6d8de341-400b-42bb-b230-f7408b6e9075","Type":"ContainerStarted","Data":"68bbc617a129f20b855347b2fce6a45621996aa2e012d227a7710e366c40266f"} Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.444738 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.446237 4990 generic.go:334] "Generic (PLEG): container finished" podID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerID="a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e" exitCode=0 Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.446276 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hqx9g" event={"ID":"b1105cd3-adde-4a69-8ea7-8834c9038f0a","Type":"ContainerDied","Data":"a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e"} Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.448442 4990 generic.go:334] "Generic (PLEG): container finished" podID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerID="a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217" exitCode=0 Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.448477 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-clsdg" event={"ID":"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07","Type":"ContainerDied","Data":"a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217"} Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.448497 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-clsdg" 
event={"ID":"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07","Type":"ContainerStarted","Data":"f0654b3524c8d5c09222a3f6ae73e90351145d6d753ff2024c1bfc6f3c7639bd"} Oct 03 11:25:39 crc kubenswrapper[4990]: I1003 11:25:39.463646 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-hp5sv" podStartSLOduration=2.7648973850000003 podStartE2EDuration="14.463629791s" podCreationTimestamp="2025-10-03 11:25:25 +0000 UTC" firstStartedPulling="2025-10-03 11:25:26.344250861 +0000 UTC m=+6108.140882718" lastFinishedPulling="2025-10-03 11:25:38.042983267 +0000 UTC m=+6119.839615124" observedRunningTime="2025-10-03 11:25:39.459877144 +0000 UTC m=+6121.256509031" watchObservedRunningTime="2025-10-03 11:25:39.463629791 +0000 UTC m=+6121.260261648" Oct 03 11:25:40 crc kubenswrapper[4990]: I1003 11:25:40.462451 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hqx9g" event={"ID":"b1105cd3-adde-4a69-8ea7-8834c9038f0a","Type":"ContainerStarted","Data":"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325"} Oct 03 11:25:40 crc kubenswrapper[4990]: I1003 11:25:40.468737 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-clsdg" event={"ID":"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07","Type":"ContainerStarted","Data":"a3c969f500070b9e0ad1e06107db456ee7c316d723b66879134e39c16dc93f78"} Oct 03 11:25:40 crc kubenswrapper[4990]: I1003 11:25:40.514042 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-hqx9g" podStartSLOduration=3.394109019 podStartE2EDuration="14.514022074s" podCreationTimestamp="2025-10-03 11:25:26 +0000 UTC" firstStartedPulling="2025-10-03 11:25:27.072695041 +0000 UTC m=+6108.869326898" lastFinishedPulling="2025-10-03 11:25:38.192608096 +0000 UTC m=+6119.989239953" observedRunningTime="2025-10-03 11:25:40.510182765 +0000 UTC m=+6122.306814702" watchObservedRunningTime="2025-10-03 
11:25:40.514022074 +0000 UTC m=+6122.310653931" Oct 03 11:25:40 crc kubenswrapper[4990]: I1003 11:25:40.544052 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-clsdg" podStartSLOduration=6.544028388 podStartE2EDuration="6.544028388s" podCreationTimestamp="2025-10-03 11:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:25:40.532645724 +0000 UTC m=+6122.329277571" watchObservedRunningTime="2025-10-03 11:25:40.544028388 +0000 UTC m=+6122.340660245" Oct 03 11:25:41 crc kubenswrapper[4990]: I1003 11:25:41.675809 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k4rfw" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" probeResult="failure" output=< Oct 03 11:25:41 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 11:25:41 crc kubenswrapper[4990]: > Oct 03 11:25:42 crc kubenswrapper[4990]: I1003 11:25:42.052680 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-79nsj"] Oct 03 11:25:42 crc kubenswrapper[4990]: I1003 11:25:42.065849 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-79nsj"] Oct 03 11:25:42 crc kubenswrapper[4990]: I1003 11:25:42.890238 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3" path="/var/lib/kubelet/pods/9e806fb6-53b2-4f3f-b9de-c9b1622c8fb3/volumes" Oct 03 11:25:42 crc kubenswrapper[4990]: E1003 11:25:42.978041 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find 
data in memory cache]" Oct 03 11:25:45 crc kubenswrapper[4990]: I1003 11:25:45.528575 4990 generic.go:334] "Generic (PLEG): container finished" podID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerID="a3c969f500070b9e0ad1e06107db456ee7c316d723b66879134e39c16dc93f78" exitCode=0 Oct 03 11:25:45 crc kubenswrapper[4990]: I1003 11:25:45.528650 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-clsdg" event={"ID":"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07","Type":"ContainerDied","Data":"a3c969f500070b9e0ad1e06107db456ee7c316d723b66879134e39c16dc93f78"} Oct 03 11:25:46 crc kubenswrapper[4990]: I1003 11:25:46.986033 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.003539 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.010107 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-786b7b89f7-lbh87" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.099269 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.099835 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-54986ddb77-tpbdc" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api" containerID="cri-o://3e8faa3a9ef813728ff839d427a63e5d6a31a83faaa35a0e5ad17a822907bfb9" gracePeriod=30 Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.100318 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-54986ddb77-tpbdc" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api-provider-agent" 
containerID="cri-o://bdd5bc4f1b3b775f4212dead38c6f52f7f63c74269c2e8012c7d26fa0e0ff163" gracePeriod=30 Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.120286 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data\") pod \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.120478 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts\") pod \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.120527 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged\") pod \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.120642 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle\") pod \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\" (UID: \"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07\") " Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.145780 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data" (OuterVolumeSpecName: "config-data") pod "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" (UID: "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.145897 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts" (OuterVolumeSpecName: "scripts") pod "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" (UID: "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.165678 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" (UID: "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.172639 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" (UID: "f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.223341 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.223600 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.223612 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.223621 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.549331 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-clsdg" event={"ID":"f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07","Type":"ContainerDied","Data":"f0654b3524c8d5c09222a3f6ae73e90351145d6d753ff2024c1bfc6f3c7639bd"} Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.549372 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0654b3524c8d5c09222a3f6ae73e90351145d6d753ff2024c1bfc6f3c7639bd" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.549346 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-clsdg" Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.551736 4990 generic.go:334] "Generic (PLEG): container finished" podID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerID="bdd5bc4f1b3b775f4212dead38c6f52f7f63c74269c2e8012c7d26fa0e0ff163" exitCode=0 Oct 03 11:25:47 crc kubenswrapper[4990]: I1003 11:25:47.551834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerDied","Data":"bdd5bc4f1b3b775f4212dead38c6f52f7f63c74269c2e8012c7d26fa0e0ff163"} Oct 03 11:25:49 crc kubenswrapper[4990]: I1003 11:25:49.780477 4990 scope.go:117] "RemoveContainer" containerID="5508dd7652b00f7f994143650fab7a1b975be2dd4c9cfcd4737414eafbe938cb" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.586570 4990 generic.go:334] "Generic (PLEG): container finished" podID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerID="3e8faa3a9ef813728ff839d427a63e5d6a31a83faaa35a0e5ad17a822907bfb9" exitCode=0 Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.586666 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerDied","Data":"3e8faa3a9ef813728ff839d427a63e5d6a31a83faaa35a0e5ad17a822907bfb9"} Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.686979 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.742023 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.744525 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899585 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899668 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899703 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899750 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899836 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: 
\"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.899872 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data\") pod \"0cbe069f-c085-4e49-9e43-66c376f9a465\" (UID: \"0cbe069f-c085-4e49-9e43-66c376f9a465\") " Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.900226 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.901567 4990 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-octavia-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.906044 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data" (OuterVolumeSpecName: "config-data") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.914025 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts" (OuterVolumeSpecName: "scripts") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.956386 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:25:50 crc kubenswrapper[4990]: I1003 11:25:50.961484 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.003840 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.003881 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.003893 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.003905 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0cbe069f-c085-4e49-9e43-66c376f9a465-config-data-merged\") on node \"crc\" 
DevicePath \"\"" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.089811 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0cbe069f-c085-4e49-9e43-66c376f9a465" (UID: "0cbe069f-c085-4e49-9e43-66c376f9a465"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.105558 4990 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cbe069f-c085-4e49-9e43-66c376f9a465-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.475245 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.599025 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-54986ddb77-tpbdc" event={"ID":"0cbe069f-c085-4e49-9e43-66c376f9a465","Type":"ContainerDied","Data":"b389b19422536bf62f528f18f4dbaf233612e02ba6f85bf399f440bc97686ccc"} Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.599082 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-54986ddb77-tpbdc" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.599399 4990 scope.go:117] "RemoveContainer" containerID="bdd5bc4f1b3b775f4212dead38c6f52f7f63c74269c2e8012c7d26fa0e0ff163" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.626786 4990 scope.go:117] "RemoveContainer" containerID="3e8faa3a9ef813728ff839d427a63e5d6a31a83faaa35a0e5ad17a822907bfb9" Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.648493 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.657316 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-54986ddb77-tpbdc"] Oct 03 11:25:51 crc kubenswrapper[4990]: I1003 11:25:51.660863 4990 scope.go:117] "RemoveContainer" containerID="7bbcc92b566fed4ee3e664992fc91bbc47ba43822e9fbecd6aec295090bdf855" Oct 03 11:25:52 crc kubenswrapper[4990]: I1003 11:25:52.611015 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4rfw" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" containerID="cri-o://8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2" gracePeriod=2 Oct 03 11:25:52 crc kubenswrapper[4990]: I1003 11:25:52.905036 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" path="/var/lib/kubelet/pods/0cbe069f-c085-4e49-9e43-66c376f9a465/volumes" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.185666 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:53 crc kubenswrapper[4990]: E1003 11:25:53.258085 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.353741 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities\") pod \"6fefd121-c58a-4df0-a600-b8072a265760\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.354145 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content\") pod \"6fefd121-c58a-4df0-a600-b8072a265760\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.354352 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xwgq\" (UniqueName: \"kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq\") pod \"6fefd121-c58a-4df0-a600-b8072a265760\" (UID: \"6fefd121-c58a-4df0-a600-b8072a265760\") " Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.354489 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities" (OuterVolumeSpecName: "utilities") pod "6fefd121-c58a-4df0-a600-b8072a265760" (UID: "6fefd121-c58a-4df0-a600-b8072a265760"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.354845 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.359741 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq" (OuterVolumeSpecName: "kube-api-access-5xwgq") pod "6fefd121-c58a-4df0-a600-b8072a265760" (UID: "6fefd121-c58a-4df0-a600-b8072a265760"). InnerVolumeSpecName "kube-api-access-5xwgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.391674 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fefd121-c58a-4df0-a600-b8072a265760" (UID: "6fefd121-c58a-4df0-a600-b8072a265760"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.456847 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fefd121-c58a-4df0-a600-b8072a265760-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.456879 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xwgq\" (UniqueName: \"kubernetes.io/projected/6fefd121-c58a-4df0-a600-b8072a265760-kube-api-access-5xwgq\") on node \"crc\" DevicePath \"\"" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.624568 4990 generic.go:334] "Generic (PLEG): container finished" podID="6fefd121-c58a-4df0-a600-b8072a265760" containerID="8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2" exitCode=0 Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.624616 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerDied","Data":"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2"} Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.624647 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4rfw" event={"ID":"6fefd121-c58a-4df0-a600-b8072a265760","Type":"ContainerDied","Data":"0bd2ea7b6844ea941e0c166e3a45f44469362b294bd447344b3ce8b5c5b7b941"} Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.624669 4990 scope.go:117] "RemoveContainer" containerID="8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.624815 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4rfw" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.660904 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.663536 4990 scope.go:117] "RemoveContainer" containerID="8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.669681 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4rfw"] Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.685088 4990 scope.go:117] "RemoveContainer" containerID="5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.746265 4990 scope.go:117] "RemoveContainer" containerID="8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2" Oct 03 11:25:53 crc kubenswrapper[4990]: E1003 11:25:53.748032 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2\": container with ID starting with 8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2 not found: ID does not exist" containerID="8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.748077 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2"} err="failed to get container status \"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2\": rpc error: code = NotFound desc = could not find container \"8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2\": container with ID starting with 8a6c8bf2f163b715ace79d62d5889ea4f8f2b129a468ecd0cc33bd21e4433de2 not 
found: ID does not exist" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.748095 4990 scope.go:117] "RemoveContainer" containerID="8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20" Oct 03 11:25:53 crc kubenswrapper[4990]: E1003 11:25:53.748453 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20\": container with ID starting with 8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20 not found: ID does not exist" containerID="8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.748472 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20"} err="failed to get container status \"8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20\": rpc error: code = NotFound desc = could not find container \"8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20\": container with ID starting with 8b84f6cad3dbe113e8d80552392dcf2b9647fc22dcab6c27f887e5cb1c643d20 not found: ID does not exist" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.748499 4990 scope.go:117] "RemoveContainer" containerID="5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd" Oct 03 11:25:53 crc kubenswrapper[4990]: E1003 11:25:53.748735 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd\": container with ID starting with 5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd not found: ID does not exist" containerID="5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd" Oct 03 11:25:53 crc kubenswrapper[4990]: I1003 11:25:53.748844 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd"} err="failed to get container status \"5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd\": rpc error: code = NotFound desc = could not find container \"5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd\": container with ID starting with 5df45ff157fba5da53c5ae70842ea3ce0fc98ab202c31e21f00bdc455a8570fd not found: ID does not exist" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.291448 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.291946 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="extract-utilities" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.291973 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="extract-utilities" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.291991 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="init" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292001 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="init" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292017 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerName="octavia-db-sync" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292029 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerName="octavia-db-sync" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292057 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292067 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292087 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerName="init" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292095 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerName="init" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292124 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="extract-content" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292131 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="extract-content" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292148 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292157 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api" Oct 03 11:25:54 crc kubenswrapper[4990]: E1003 11:25:54.292180 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api-provider-agent" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292188 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api-provider-agent" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292407 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292441 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbe069f-c085-4e49-9e43-66c376f9a465" containerName="octavia-api-provider-agent" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292455 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" containerName="octavia-db-sync" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.292466 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fefd121-c58a-4df0-a600-b8072a265760" containerName="registry-server" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.294417 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.304610 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.476373 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7mj\" (UniqueName: \"kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.476457 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.476607 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.578472 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.578616 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7mj\" (UniqueName: \"kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.578697 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.579098 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.579145 4990 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.600251 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7mj\" (UniqueName: \"kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj\") pod \"community-operators-w8jzb\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.636430 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:25:54 crc kubenswrapper[4990]: I1003 11:25:54.907212 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fefd121-c58a-4df0-a600-b8072a265760" path="/var/lib/kubelet/pods/6fefd121-c58a-4df0-a600-b8072a265760/volumes" Oct 03 11:25:55 crc kubenswrapper[4990]: I1003 11:25:55.196059 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:25:55 crc kubenswrapper[4990]: I1003 11:25:55.648491 4990 generic.go:334] "Generic (PLEG): container finished" podID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerID="0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e" exitCode=0 Oct 03 11:25:55 crc kubenswrapper[4990]: I1003 11:25:55.648640 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerDied","Data":"0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e"} Oct 03 11:25:55 crc kubenswrapper[4990]: I1003 11:25:55.648824 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerStarted","Data":"09f43892a373c97dc1be2dcf4c8d70c3185dcebf3432d682fd0548c4213c494c"} Oct 03 11:25:55 crc kubenswrapper[4990]: I1003 11:25:55.753733 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-hp5sv" Oct 03 11:25:57 crc kubenswrapper[4990]: I1003 11:25:57.670449 4990 generic.go:334] "Generic (PLEG): container finished" podID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerID="c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779" exitCode=0 Oct 03 11:25:57 crc kubenswrapper[4990]: I1003 11:25:57.671126 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerDied","Data":"c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779"} Oct 03 11:25:58 crc kubenswrapper[4990]: I1003 11:25:58.682461 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerStarted","Data":"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa"} Oct 03 11:25:58 crc kubenswrapper[4990]: I1003 11:25:58.702597 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w8jzb" podStartSLOduration=2.239887205 podStartE2EDuration="4.702576087s" podCreationTimestamp="2025-10-03 11:25:54 +0000 UTC" firstStartedPulling="2025-10-03 11:25:55.650699258 +0000 UTC m=+6137.447331115" lastFinishedPulling="2025-10-03 11:25:58.11338814 +0000 UTC m=+6139.910019997" observedRunningTime="2025-10-03 11:25:58.697934398 +0000 UTC m=+6140.494566275" watchObservedRunningTime="2025-10-03 11:25:58.702576087 +0000 UTC m=+6140.499207944" Oct 03 11:26:00 crc kubenswrapper[4990]: I1003 11:26:00.051666 4990 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bc6c-account-create-zcz47"] Oct 03 11:26:00 crc kubenswrapper[4990]: I1003 11:26:00.060607 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bc6c-account-create-zcz47"] Oct 03 11:26:00 crc kubenswrapper[4990]: I1003 11:26:00.883785 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146d3ad2-f23d-47b9-a404-8b048b59c8fc" path="/var/lib/kubelet/pods/146d3ad2-f23d-47b9-a404-8b048b59c8fc/volumes" Oct 03 11:26:03 crc kubenswrapper[4990]: E1003 11:26:03.525748 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:26:04 crc kubenswrapper[4990]: I1003 11:26:04.637242 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:04 crc kubenswrapper[4990]: I1003 11:26:04.637295 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:04 crc kubenswrapper[4990]: I1003 11:26:04.688119 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:04 crc kubenswrapper[4990]: I1003 11:26:04.805435 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:04 crc kubenswrapper[4990]: I1003 11:26:04.949418 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:26:06 crc kubenswrapper[4990]: I1003 11:26:06.037151 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-6qnlk"] Oct 03 11:26:06 crc kubenswrapper[4990]: I1003 11:26:06.044299 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6qnlk"] Oct 03 11:26:06 crc kubenswrapper[4990]: I1003 11:26:06.772050 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8jzb" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="registry-server" containerID="cri-o://93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa" gracePeriod=2 Oct 03 11:26:06 crc kubenswrapper[4990]: I1003 11:26:06.895293 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9ecda3-14e3-4135-af9d-e975c262734d" path="/var/lib/kubelet/pods/5f9ecda3-14e3-4135-af9d-e975c262734d/volumes" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.258292 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.349188 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl7mj\" (UniqueName: \"kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj\") pod \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.349626 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content\") pod \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.349806 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities\") pod 
\"0311a26d-dc08-4614-94c5-439c4b9a4a3d\" (UID: \"0311a26d-dc08-4614-94c5-439c4b9a4a3d\") " Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.351002 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities" (OuterVolumeSpecName: "utilities") pod "0311a26d-dc08-4614-94c5-439c4b9a4a3d" (UID: "0311a26d-dc08-4614-94c5-439c4b9a4a3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.354602 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj" (OuterVolumeSpecName: "kube-api-access-kl7mj") pod "0311a26d-dc08-4614-94c5-439c4b9a4a3d" (UID: "0311a26d-dc08-4614-94c5-439c4b9a4a3d"). InnerVolumeSpecName "kube-api-access-kl7mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.396949 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0311a26d-dc08-4614-94c5-439c4b9a4a3d" (UID: "0311a26d-dc08-4614-94c5-439c4b9a4a3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.452159 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.452194 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl7mj\" (UniqueName: \"kubernetes.io/projected/0311a26d-dc08-4614-94c5-439c4b9a4a3d-kube-api-access-kl7mj\") on node \"crc\" DevicePath \"\"" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.452210 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0311a26d-dc08-4614-94c5-439c4b9a4a3d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.786421 4990 generic.go:334] "Generic (PLEG): container finished" podID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerID="93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa" exitCode=0 Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.786493 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerDied","Data":"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa"} Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.786600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8jzb" event={"ID":"0311a26d-dc08-4614-94c5-439c4b9a4a3d","Type":"ContainerDied","Data":"09f43892a373c97dc1be2dcf4c8d70c3185dcebf3432d682fd0548c4213c494c"} Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.786629 4990 scope.go:117] "RemoveContainer" containerID="93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 
11:26:07.786581 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8jzb" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.838348 4990 scope.go:117] "RemoveContainer" containerID="c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.839552 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.854896 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8jzb"] Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.881847 4990 scope.go:117] "RemoveContainer" containerID="0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.913686 4990 scope.go:117] "RemoveContainer" containerID="93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa" Oct 03 11:26:07 crc kubenswrapper[4990]: E1003 11:26:07.915612 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa\": container with ID starting with 93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa not found: ID does not exist" containerID="93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.915719 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa"} err="failed to get container status \"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa\": rpc error: code = NotFound desc = could not find container \"93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa\": container with ID starting with 
93dafb02e6df1ace9dc11162caf576936005b670ebcbf938a25642899fc1dcaa not found: ID does not exist" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.915882 4990 scope.go:117] "RemoveContainer" containerID="c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779" Oct 03 11:26:07 crc kubenswrapper[4990]: E1003 11:26:07.917138 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779\": container with ID starting with c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779 not found: ID does not exist" containerID="c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.917262 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779"} err="failed to get container status \"c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779\": rpc error: code = NotFound desc = could not find container \"c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779\": container with ID starting with c01b11f363b1add1826f4e71cf1ed25a8db19aae0d4f510836f3b9060b552779 not found: ID does not exist" Oct 03 11:26:07 crc kubenswrapper[4990]: I1003 11:26:07.917292 4990 scope.go:117] "RemoveContainer" containerID="0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e" Oct 03 11:26:07 crc kubenswrapper[4990]: E1003 11:26:07.917572 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e\": container with ID starting with 0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e not found: ID does not exist" containerID="0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e" Oct 03 11:26:07 crc 
kubenswrapper[4990]: I1003 11:26:07.917601 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e"} err="failed to get container status \"0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e\": rpc error: code = NotFound desc = could not find container \"0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e\": container with ID starting with 0b1cf0af69cd2d87bbe8d53ef9d3bdb3a9f601d5355dfaa964b80ab62e491e0e not found: ID does not exist" Oct 03 11:26:08 crc kubenswrapper[4990]: I1003 11:26:08.894898 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" path="/var/lib/kubelet/pods/0311a26d-dc08-4614-94c5-439c4b9a4a3d/volumes" Oct 03 11:26:08 crc kubenswrapper[4990]: I1003 11:26:08.968358 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:26:08 crc kubenswrapper[4990]: I1003 11:26:08.968917 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-678599687f-hqx9g" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="octavia-amphora-httpd" containerID="cri-o://2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325" gracePeriod=30 Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.573195 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.697254 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config\") pod \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.697349 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image\") pod \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\" (UID: \"b1105cd3-adde-4a69-8ea7-8834c9038f0a\") " Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.740040 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b1105cd3-adde-4a69-8ea7-8834c9038f0a" (UID: "b1105cd3-adde-4a69-8ea7-8834c9038f0a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.800144 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b1105cd3-adde-4a69-8ea7-8834c9038f0a-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.803743 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "b1105cd3-adde-4a69-8ea7-8834c9038f0a" (UID: "b1105cd3-adde-4a69-8ea7-8834c9038f0a"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.812214 4990 generic.go:334] "Generic (PLEG): container finished" podID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerID="2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325" exitCode=0 Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.812266 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hqx9g" event={"ID":"b1105cd3-adde-4a69-8ea7-8834c9038f0a","Type":"ContainerDied","Data":"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325"} Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.812294 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-hqx9g" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.812315 4990 scope.go:117] "RemoveContainer" containerID="2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.812303 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-hqx9g" event={"ID":"b1105cd3-adde-4a69-8ea7-8834c9038f0a","Type":"ContainerDied","Data":"9dc0e6cc8915acbcc2a4b726ed394e8155caeaef6a10534302c41f68fedd4791"} Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.845213 4990 scope.go:117] "RemoveContainer" containerID="a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.852976 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.860552 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-678599687f-hqx9g"] Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.880343 4990 scope.go:117] "RemoveContainer" 
containerID="2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325" Oct 03 11:26:09 crc kubenswrapper[4990]: E1003 11:26:09.880941 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325\": container with ID starting with 2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325 not found: ID does not exist" containerID="2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.880993 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325"} err="failed to get container status \"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325\": rpc error: code = NotFound desc = could not find container \"2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325\": container with ID starting with 2e53075ebc80f53a202c62e605d13974c1c40b1b1fddc0bd84715258943ce325 not found: ID does not exist" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.881015 4990 scope.go:117] "RemoveContainer" containerID="a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e" Oct 03 11:26:09 crc kubenswrapper[4990]: E1003 11:26:09.881437 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e\": container with ID starting with a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e not found: ID does not exist" containerID="a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.881481 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e"} err="failed to get container status \"a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e\": rpc error: code = NotFound desc = could not find container \"a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e\": container with ID starting with a616764f56f2a7fcad7ada358c195709b10ffc1b173feab55d08a367115e856e not found: ID does not exist" Oct 03 11:26:09 crc kubenswrapper[4990]: I1003 11:26:09.902375 4990 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b1105cd3-adde-4a69-8ea7-8834c9038f0a-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 03 11:26:10 crc kubenswrapper[4990]: I1003 11:26:10.888146 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" path="/var/lib/kubelet/pods/b1105cd3-adde-4a69-8ea7-8834c9038f0a/volumes" Oct 03 11:26:13 crc kubenswrapper[4990]: E1003 11:26:13.754648 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.941737 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-678599687f-nzgf2"] Oct 03 11:26:14 crc kubenswrapper[4990]: E1003 11:26:14.942735 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="extract-content" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.942755 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="extract-content" Oct 03 11:26:14 crc 
kubenswrapper[4990]: E1003 11:26:14.942781 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="registry-server" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.942792 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="registry-server" Oct 03 11:26:14 crc kubenswrapper[4990]: E1003 11:26:14.942807 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="init" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.942816 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="init" Oct 03 11:26:14 crc kubenswrapper[4990]: E1003 11:26:14.942830 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="octavia-amphora-httpd" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.942840 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="octavia-amphora-httpd" Oct 03 11:26:14 crc kubenswrapper[4990]: E1003 11:26:14.942880 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="extract-utilities" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.942890 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="extract-utilities" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.943205 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1105cd3-adde-4a69-8ea7-8834c9038f0a" containerName="octavia-amphora-httpd" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.943239 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0311a26d-dc08-4614-94c5-439c4b9a4a3d" containerName="registry-server" Oct 03 11:26:14 crc 
kubenswrapper[4990]: I1003 11:26:14.945191 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.949207 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 03 11:26:14 crc kubenswrapper[4990]: I1003 11:26:14.955384 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-nzgf2"] Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.111586 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3095d7b-e123-45ac-873c-6bf4b14b1f21-amphora-image\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.112155 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3095d7b-e123-45ac-873c-6bf4b14b1f21-httpd-config\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.213978 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3095d7b-e123-45ac-873c-6bf4b14b1f21-amphora-image\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.214069 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/c3095d7b-e123-45ac-873c-6bf4b14b1f21-httpd-config\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.215372 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/c3095d7b-e123-45ac-873c-6bf4b14b1f21-amphora-image\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.225550 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c3095d7b-e123-45ac-873c-6bf4b14b1f21-httpd-config\") pod \"octavia-image-upload-678599687f-nzgf2\" (UID: \"c3095d7b-e123-45ac-873c-6bf4b14b1f21\") " pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.281824 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-678599687f-nzgf2" Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.754239 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-678599687f-nzgf2"] Oct 03 11:26:15 crc kubenswrapper[4990]: I1003 11:26:15.879593 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-nzgf2" event={"ID":"c3095d7b-e123-45ac-873c-6bf4b14b1f21","Type":"ContainerStarted","Data":"4499f7da4d138d5eb5fa9858fb2afccb0e33b9a873d3c1537bbb332384bab2a1"} Oct 03 11:26:16 crc kubenswrapper[4990]: I1003 11:26:16.891701 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-nzgf2" event={"ID":"c3095d7b-e123-45ac-873c-6bf4b14b1f21","Type":"ContainerStarted","Data":"638407f96d602e9beb6daaad87c6fbb7f8f3b12392f44aec5b940bc3ed2e598b"} Oct 03 11:26:17 crc kubenswrapper[4990]: I1003 11:26:17.903238 4990 generic.go:334] "Generic (PLEG): container finished" podID="c3095d7b-e123-45ac-873c-6bf4b14b1f21" containerID="638407f96d602e9beb6daaad87c6fbb7f8f3b12392f44aec5b940bc3ed2e598b" exitCode=0 Oct 03 11:26:17 crc kubenswrapper[4990]: I1003 11:26:17.903346 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-nzgf2" event={"ID":"c3095d7b-e123-45ac-873c-6bf4b14b1f21","Type":"ContainerDied","Data":"638407f96d602e9beb6daaad87c6fbb7f8f3b12392f44aec5b940bc3ed2e598b"} Oct 03 11:26:18 crc kubenswrapper[4990]: I1003 11:26:18.918572 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-678599687f-nzgf2" event={"ID":"c3095d7b-e123-45ac-873c-6bf4b14b1f21","Type":"ContainerStarted","Data":"929916a59bd6fe796a3882f0cc8250b93a89e12fee9959ba1102742382b58be9"} Oct 03 11:26:18 crc kubenswrapper[4990]: I1003 11:26:18.938496 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-678599687f-nzgf2" 
podStartSLOduration=4.345676259 podStartE2EDuration="4.9384681s" podCreationTimestamp="2025-10-03 11:26:14 +0000 UTC" firstStartedPulling="2025-10-03 11:26:15.764030198 +0000 UTC m=+6157.560662045" lastFinishedPulling="2025-10-03 11:26:16.356822029 +0000 UTC m=+6158.153453886" observedRunningTime="2025-10-03 11:26:18.932135547 +0000 UTC m=+6160.728767414" watchObservedRunningTime="2025-10-03 11:26:18.9384681 +0000 UTC m=+6160.735099967" Oct 03 11:26:24 crc kubenswrapper[4990]: E1003 11:26:24.018910 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:26:25 crc kubenswrapper[4990]: I1003 11:26:25.304499 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:26:25 crc kubenswrapper[4990]: I1003 11:26:25.305185 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.182819 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-hdwv4"] Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.185369 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.191822 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.192417 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.192936 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.199061 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hdwv4"] Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.230468 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.230599 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-combined-ca-bundle\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.230641 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e4bb434c-f4bf-4f15-b354-e10c0c481296-hm-ports\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 
11:26:30.230664 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-scripts\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.230731 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data-merged\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.230763 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-amphora-certs\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.332355 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.332812 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-combined-ca-bundle\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.332898 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e4bb434c-f4bf-4f15-b354-e10c0c481296-hm-ports\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.332934 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-scripts\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.333028 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data-merged\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.333078 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-amphora-certs\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.335843 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e4bb434c-f4bf-4f15-b354-e10c0c481296-hm-ports\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.335963 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data-merged\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.339105 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-combined-ca-bundle\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.339129 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-amphora-certs\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.339381 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-config-data\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.348337 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4bb434c-f4bf-4f15-b354-e10c0c481296-scripts\") pod \"octavia-healthmanager-hdwv4\" (UID: \"e4bb434c-f4bf-4f15-b354-e10c0c481296\") " pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:30 crc kubenswrapper[4990]: I1003 11:26:30.511998 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.243432 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hdwv4"] Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.793540 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-wlmv4"] Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.799195 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.806896 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.806928 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.813452 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-wlmv4"] Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911020 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911099 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-combined-ca-bundle\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911215 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data-merged\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911258 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-amphora-certs\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911440 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-scripts\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:31 crc kubenswrapper[4990]: I1003 11:26:31.911664 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/925949a3-bcd4-45b8-ad30-1a86bec81360-hm-ports\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016137 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data-merged\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016214 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-amphora-certs\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016341 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-scripts\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016392 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/925949a3-bcd4-45b8-ad30-1a86bec81360-hm-ports\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016621 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016671 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-combined-ca-bundle\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.016625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data-merged\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.018465 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/925949a3-bcd4-45b8-ad30-1a86bec81360-hm-ports\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.024486 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-config-data\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.025196 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-combined-ca-bundle\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.025522 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-scripts\") pod \"octavia-housekeeping-wlmv4\" (UID: \"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.025687 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/925949a3-bcd4-45b8-ad30-1a86bec81360-amphora-certs\") pod \"octavia-housekeeping-wlmv4\" (UID: 
\"925949a3-bcd4-45b8-ad30-1a86bec81360\") " pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.061325 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hdwv4" event={"ID":"e4bb434c-f4bf-4f15-b354-e10c0c481296","Type":"ContainerStarted","Data":"be35469a8f0a86123c7f5b2fb60e3a70858c7c0b1c4967916bc2bd69b757d488"} Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.061377 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hdwv4" event={"ID":"e4bb434c-f4bf-4f15-b354-e10c0c481296","Type":"ContainerStarted","Data":"fa396be110cf7eea1cd9a1ce9e059d29636107e7d21db0eada6a85c24d770f52"} Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.125475 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:32 crc kubenswrapper[4990]: I1003 11:26:32.670794 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-wlmv4"] Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.070899 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-wlmv4" event={"ID":"925949a3-bcd4-45b8-ad30-1a86bec81360","Type":"ContainerStarted","Data":"aa8ca122127db3704e78f29ac15b8db909422105e2de597455a086c0173eb427"} Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.589849 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-v2rc9"] Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.591768 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.595611 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.595756 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.601229 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-v2rc9"] Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750526 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-scripts\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750578 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-config-data\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750615 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-amphora-certs\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750659 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-combined-ca-bundle\") pod 
\"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750700 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/23a35a64-97d4-407e-8505-9eb966f5932f-hm-ports\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.750740 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/23a35a64-97d4-407e-8505-9eb966f5932f-config-data-merged\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.851956 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-scripts\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.852019 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-config-data\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.852056 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-amphora-certs\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc 
kubenswrapper[4990]: I1003 11:26:33.853184 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-combined-ca-bundle\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.853259 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/23a35a64-97d4-407e-8505-9eb966f5932f-hm-ports\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.853313 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/23a35a64-97d4-407e-8505-9eb966f5932f-config-data-merged\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.853779 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/23a35a64-97d4-407e-8505-9eb966f5932f-config-data-merged\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.854386 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/23a35a64-97d4-407e-8505-9eb966f5932f-hm-ports\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.858094 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-amphora-certs\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.859072 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-combined-ca-bundle\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.859074 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-scripts\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.860409 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a35a64-97d4-407e-8505-9eb966f5932f-config-data\") pod \"octavia-worker-v2rc9\" (UID: \"23a35a64-97d4-407e-8505-9eb966f5932f\") " pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:33 crc kubenswrapper[4990]: I1003 11:26:33.919734 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:34 crc kubenswrapper[4990]: I1003 11:26:34.138250 4990 generic.go:334] "Generic (PLEG): container finished" podID="e4bb434c-f4bf-4f15-b354-e10c0c481296" containerID="be35469a8f0a86123c7f5b2fb60e3a70858c7c0b1c4967916bc2bd69b757d488" exitCode=0 Oct 03 11:26:34 crc kubenswrapper[4990]: I1003 11:26:34.138288 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hdwv4" event={"ID":"e4bb434c-f4bf-4f15-b354-e10c0c481296","Type":"ContainerDied","Data":"be35469a8f0a86123c7f5b2fb60e3a70858c7c0b1c4967916bc2bd69b757d488"} Oct 03 11:26:34 crc kubenswrapper[4990]: E1003 11:26:34.251116 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b7e197_3a7b_4f72_aaa5_6ec3d5411f07.slice/crio-conmon-a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:26:34 crc kubenswrapper[4990]: I1003 11:26:34.506094 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-v2rc9"] Oct 03 11:26:35 crc kubenswrapper[4990]: W1003 11:26:34.521567 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a35a64_97d4_407e_8505_9eb966f5932f.slice/crio-ec1af34dfd97dbfb318f2fc97d01ae3659ddfe9a5689e4d741754e5a4339bd0e WatchSource:0}: Error finding container ec1af34dfd97dbfb318f2fc97d01ae3659ddfe9a5689e4d741754e5a4339bd0e: Status 404 returned error can't find the container with id ec1af34dfd97dbfb318f2fc97d01ae3659ddfe9a5689e4d741754e5a4339bd0e Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.148563 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-wlmv4" 
event={"ID":"925949a3-bcd4-45b8-ad30-1a86bec81360","Type":"ContainerStarted","Data":"6e158c9a1e343418a9c5f1f4267e681ebfbf54bfb4dfac38b8eb5856112642c3"} Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.151713 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-v2rc9" event={"ID":"23a35a64-97d4-407e-8505-9eb966f5932f","Type":"ContainerStarted","Data":"ec1af34dfd97dbfb318f2fc97d01ae3659ddfe9a5689e4d741754e5a4339bd0e"} Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.155240 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hdwv4" event={"ID":"e4bb434c-f4bf-4f15-b354-e10c0c481296","Type":"ContainerStarted","Data":"442ed11c91db5481d356dd3ecd5a68199f3dfc3bafbc5dd7229eda57150f95f9"} Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.155563 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-hdwv4" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.213552 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-hdwv4" podStartSLOduration=5.213417324 podStartE2EDuration="5.213417324s" podCreationTimestamp="2025-10-03 11:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:26:35.212911341 +0000 UTC m=+6177.009543208" watchObservedRunningTime="2025-10-03 11:26:35.213417324 +0000 UTC m=+6177.010049191" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.362100 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"] Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.364280 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.372224 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"] Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.499504 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.499573 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9gf6\" (UniqueName: \"kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.499639 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.602008 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.602076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z9gf6\" (UniqueName: \"kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.602179 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.603050 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.603119 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.647810 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9gf6\" (UniqueName: \"kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6\") pod \"redhat-operators-xkt8d\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") " pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:35 crc kubenswrapper[4990]: I1003 11:26:35.686465 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkt8d" Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.044704 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-x4wxk"] Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.053029 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-x4wxk"] Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.164703 4990 generic.go:334] "Generic (PLEG): container finished" podID="925949a3-bcd4-45b8-ad30-1a86bec81360" containerID="6e158c9a1e343418a9c5f1f4267e681ebfbf54bfb4dfac38b8eb5856112642c3" exitCode=0 Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.164782 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-wlmv4" event={"ID":"925949a3-bcd4-45b8-ad30-1a86bec81360","Type":"ContainerDied","Data":"6e158c9a1e343418a9c5f1f4267e681ebfbf54bfb4dfac38b8eb5856112642c3"} Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.834545 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"] Oct 03 11:26:36 crc kubenswrapper[4990]: I1003 11:26:36.886701 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cd9c7b-bbc1-4b75-a466-d98ae13d1883" path="/var/lib/kubelet/pods/78cd9c7b-bbc1-4b75-a466-d98ae13d1883/volumes" Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.188292 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-v2rc9" event={"ID":"23a35a64-97d4-407e-8505-9eb966f5932f","Type":"ContainerStarted","Data":"45f11e91a379a2c70cc67924124a6192abd74262ac57b8767343f39184c0ab98"} Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.189881 4990 generic.go:334] "Generic (PLEG): container finished" podID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerID="a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8" exitCode=0 Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 
11:26:37.189952 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerDied","Data":"a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8"} Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.189974 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerStarted","Data":"6fb7ca22044f52bafc657fe1427fe1a816fdaf3b5e369703f16ca7b2070c6ca9"} Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.200179 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-wlmv4" event={"ID":"925949a3-bcd4-45b8-ad30-1a86bec81360","Type":"ContainerStarted","Data":"da400e648e27267715e201a79ed6d84cfc8637a1264f2cbcd662ffe5aa7419b9"} Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.200353 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-wlmv4" Oct 03 11:26:37 crc kubenswrapper[4990]: I1003 11:26:37.231945 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-wlmv4" podStartSLOduration=4.905081836 podStartE2EDuration="6.231924319s" podCreationTimestamp="2025-10-03 11:26:31 +0000 UTC" firstStartedPulling="2025-10-03 11:26:32.677081473 +0000 UTC m=+6174.473713330" lastFinishedPulling="2025-10-03 11:26:34.003923956 +0000 UTC m=+6175.800555813" observedRunningTime="2025-10-03 11:26:37.228223123 +0000 UTC m=+6179.024854980" watchObservedRunningTime="2025-10-03 11:26:37.231924319 +0000 UTC m=+6179.028556176" Oct 03 11:26:38 crc kubenswrapper[4990]: I1003 11:26:38.213580 4990 generic.go:334] "Generic (PLEG): container finished" podID="23a35a64-97d4-407e-8505-9eb966f5932f" containerID="45f11e91a379a2c70cc67924124a6192abd74262ac57b8767343f39184c0ab98" exitCode=0 Oct 03 11:26:38 crc 
kubenswrapper[4990]: I1003 11:26:38.213804 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-v2rc9" event={"ID":"23a35a64-97d4-407e-8505-9eb966f5932f","Type":"ContainerDied","Data":"45f11e91a379a2c70cc67924124a6192abd74262ac57b8767343f39184c0ab98"} Oct 03 11:26:39 crc kubenswrapper[4990]: I1003 11:26:39.226631 4990 generic.go:334] "Generic (PLEG): container finished" podID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerID="c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75" exitCode=0 Oct 03 11:26:39 crc kubenswrapper[4990]: I1003 11:26:39.226726 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerDied","Data":"c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75"} Oct 03 11:26:39 crc kubenswrapper[4990]: I1003 11:26:39.231307 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-v2rc9" event={"ID":"23a35a64-97d4-407e-8505-9eb966f5932f","Type":"ContainerStarted","Data":"68022186612d192b5e025c267e8313c40512b1747e0dfaf028f3ae2dd32d56ca"} Oct 03 11:26:39 crc kubenswrapper[4990]: I1003 11:26:39.231460 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-v2rc9" Oct 03 11:26:39 crc kubenswrapper[4990]: I1003 11:26:39.274528 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-v2rc9" podStartSLOduration=4.535982601 podStartE2EDuration="6.274473174s" podCreationTimestamp="2025-10-03 11:26:33 +0000 UTC" firstStartedPulling="2025-10-03 11:26:34.524234457 +0000 UTC m=+6176.320866314" lastFinishedPulling="2025-10-03 11:26:36.26272503 +0000 UTC m=+6178.059356887" observedRunningTime="2025-10-03 11:26:39.268330126 +0000 UTC m=+6181.064961983" watchObservedRunningTime="2025-10-03 11:26:39.274473174 +0000 UTC m=+6181.071105081" Oct 03 11:26:40 crc kubenswrapper[4990]: I1003 
11:26:40.242868 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerStarted","Data":"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"}
Oct 03 11:26:40 crc kubenswrapper[4990]: I1003 11:26:40.270590 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkt8d" podStartSLOduration=2.670733758 podStartE2EDuration="5.270569198s" podCreationTimestamp="2025-10-03 11:26:35 +0000 UTC" firstStartedPulling="2025-10-03 11:26:37.192582094 +0000 UTC m=+6178.989213951" lastFinishedPulling="2025-10-03 11:26:39.792417524 +0000 UTC m=+6181.589049391" observedRunningTime="2025-10-03 11:26:40.260454017 +0000 UTC m=+6182.057085884" watchObservedRunningTime="2025-10-03 11:26:40.270569198 +0000 UTC m=+6182.067201055"
Oct 03 11:26:45 crc kubenswrapper[4990]: I1003 11:26:45.568943 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-hdwv4"
Oct 03 11:26:45 crc kubenswrapper[4990]: I1003 11:26:45.687840 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:45 crc kubenswrapper[4990]: I1003 11:26:45.687906 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:46 crc kubenswrapper[4990]: I1003 11:26:46.030022 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b723-account-create-vl7nv"]
Oct 03 11:26:46 crc kubenswrapper[4990]: I1003 11:26:46.039255 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b723-account-create-vl7nv"]
Oct 03 11:26:46 crc kubenswrapper[4990]: I1003 11:26:46.737571 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xkt8d" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="registry-server" probeResult="failure" output=<
Oct 03 11:26:46 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s
Oct 03 11:26:46 crc kubenswrapper[4990]: >
Oct 03 11:26:46 crc kubenswrapper[4990]: I1003 11:26:46.882631 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a57ab6-7281-437a-a89b-ea4e74b6cb6c" path="/var/lib/kubelet/pods/f4a57ab6-7281-437a-a89b-ea4e74b6cb6c/volumes"
Oct 03 11:26:47 crc kubenswrapper[4990]: I1003 11:26:47.156464 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-wlmv4"
Oct 03 11:26:48 crc kubenswrapper[4990]: I1003 11:26:48.951884 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-v2rc9"
Oct 03 11:26:49 crc kubenswrapper[4990]: I1003 11:26:49.864616 4990 scope.go:117] "RemoveContainer" containerID="8688690f7c64f8f72db4158ca0da33060ef0a34441c0eea983ea642828f4e5c3"
Oct 03 11:26:49 crc kubenswrapper[4990]: I1003 11:26:49.887743 4990 scope.go:117] "RemoveContainer" containerID="b11d11103862df7a7c17e9afe8d302f90948e74a4431307b88d15155d1bf7246"
Oct 03 11:26:49 crc kubenswrapper[4990]: I1003 11:26:49.946916 4990 scope.go:117] "RemoveContainer" containerID="5eba6f43018d61a6824deadc2094ff2cb7391095777adbdd5d88502a859b6087"
Oct 03 11:26:50 crc kubenswrapper[4990]: I1003 11:26:49.988658 4990 scope.go:117] "RemoveContainer" containerID="4e155499c1209a30f5b907aaef9340aa5a8adf92e8fda1755972c120c85b5bec"
Oct 03 11:26:55 crc kubenswrapper[4990]: I1003 11:26:55.303771 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 11:26:55 crc kubenswrapper[4990]: I1003 11:26:55.304349 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 11:26:55 crc kubenswrapper[4990]: I1003 11:26:55.745980 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:55 crc kubenswrapper[4990]: I1003 11:26:55.799246 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:55 crc kubenswrapper[4990]: I1003 11:26:55.988283 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"]
Oct 03 11:26:56 crc kubenswrapper[4990]: I1003 11:26:56.028565 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fnnfs"]
Oct 03 11:26:56 crc kubenswrapper[4990]: I1003 11:26:56.037243 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fnnfs"]
Oct 03 11:26:56 crc kubenswrapper[4990]: I1003 11:26:56.886567 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb20843f-2c82-4d5c-b069-7740e7af9777" path="/var/lib/kubelet/pods/bb20843f-2c82-4d5c-b069-7740e7af9777/volumes"
Oct 03 11:26:57 crc kubenswrapper[4990]: I1003 11:26:57.405166 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkt8d" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="registry-server" containerID="cri-o://db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d" gracePeriod=2
Oct 03 11:26:57 crc kubenswrapper[4990]: I1003 11:26:57.973465 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.070112 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9gf6\" (UniqueName: \"kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6\") pod \"90bd1b20-c76e-4517-abd0-525f7666c02b\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") "
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.070232 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities\") pod \"90bd1b20-c76e-4517-abd0-525f7666c02b\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") "
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.070408 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content\") pod \"90bd1b20-c76e-4517-abd0-525f7666c02b\" (UID: \"90bd1b20-c76e-4517-abd0-525f7666c02b\") "
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.071143 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities" (OuterVolumeSpecName: "utilities") pod "90bd1b20-c76e-4517-abd0-525f7666c02b" (UID: "90bd1b20-c76e-4517-abd0-525f7666c02b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.076290 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6" (OuterVolumeSpecName: "kube-api-access-z9gf6") pod "90bd1b20-c76e-4517-abd0-525f7666c02b" (UID: "90bd1b20-c76e-4517-abd0-525f7666c02b"). InnerVolumeSpecName "kube-api-access-z9gf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.158435 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90bd1b20-c76e-4517-abd0-525f7666c02b" (UID: "90bd1b20-c76e-4517-abd0-525f7666c02b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.172726 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.172773 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9gf6\" (UniqueName: \"kubernetes.io/projected/90bd1b20-c76e-4517-abd0-525f7666c02b-kube-api-access-z9gf6\") on node \"crc\" DevicePath \"\""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.172787 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90bd1b20-c76e-4517-abd0-525f7666c02b-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.419200 4990 generic.go:334] "Generic (PLEG): container finished" podID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerID="db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d" exitCode=0
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.419235 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerDied","Data":"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"}
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.419271 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkt8d" event={"ID":"90bd1b20-c76e-4517-abd0-525f7666c02b","Type":"ContainerDied","Data":"6fb7ca22044f52bafc657fe1427fe1a816fdaf3b5e369703f16ca7b2070c6ca9"}
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.419291 4990 scope.go:117] "RemoveContainer" containerID="db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.419541 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkt8d"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.449179 4990 scope.go:117] "RemoveContainer" containerID="c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.468188 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"]
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.475851 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkt8d"]
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.483342 4990 scope.go:117] "RemoveContainer" containerID="a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.538781 4990 scope.go:117] "RemoveContainer" containerID="db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"
Oct 03 11:26:58 crc kubenswrapper[4990]: E1003 11:26:58.541495 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d\": container with ID starting with db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d not found: ID does not exist" containerID="db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.541542 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d"} err="failed to get container status \"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d\": rpc error: code = NotFound desc = could not find container \"db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d\": container with ID starting with db228c5c1457196fa52b9db963f0d3e1a36cd19ef300781b550f725c7673db7d not found: ID does not exist"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.541573 4990 scope.go:117] "RemoveContainer" containerID="c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75"
Oct 03 11:26:58 crc kubenswrapper[4990]: E1003 11:26:58.541863 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75\": container with ID starting with c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75 not found: ID does not exist" containerID="c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.541891 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75"} err="failed to get container status \"c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75\": rpc error: code = NotFound desc = could not find container \"c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75\": container with ID starting with c0b829958cc2265f0432eafd2671651e83f40f80f012887ffdcddb92c7840f75 not found: ID does not exist"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.541911 4990 scope.go:117] "RemoveContainer" containerID="a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8"
Oct 03 11:26:58 crc kubenswrapper[4990]: E1003 11:26:58.542195 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8\": container with ID starting with a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8 not found: ID does not exist" containerID="a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.542223 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8"} err="failed to get container status \"a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8\": rpc error: code = NotFound desc = could not find container \"a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8\": container with ID starting with a9ab3232a137bcb6a7442dc67b988303fb30dff712ea1157fe8626d8f56517b8 not found: ID does not exist"
Oct 03 11:26:58 crc kubenswrapper[4990]: I1003 11:26:58.896604 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" path="/var/lib/kubelet/pods/90bd1b20-c76e-4517-abd0-525f7666c02b/volumes"
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.304063 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.304879 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.305276 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62"
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.307232 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.307353 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3" gracePeriod=600
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.709538 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3" exitCode=0
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.709577 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3"}
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.709979 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1"}
Oct 03 11:27:25 crc kubenswrapper[4990]: I1003 11:27:25.710002 4990 scope.go:117] "RemoveContainer" containerID="926b85ee55f3d892971df04e4e4ba04d0f884e3f92f5a5480f96f8d9954e2e16"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.519151 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"]
Oct 03 11:27:29 crc kubenswrapper[4990]: E1003 11:27:29.520231 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="registry-server"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.520250 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="registry-server"
Oct 03 11:27:29 crc kubenswrapper[4990]: E1003 11:27:29.520272 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="extract-utilities"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.520279 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="extract-utilities"
Oct 03 11:27:29 crc kubenswrapper[4990]: E1003 11:27:29.520322 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="extract-content"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.520331 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="extract-content"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.520618 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="90bd1b20-c76e-4517-abd0-525f7666c02b" containerName="registry-server"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.522376 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.537250 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"]
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.561686 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjcx\" (UniqueName: \"kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.561796 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.561881 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.663790 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.664350 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.664373 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.664633 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfjcx\" (UniqueName: \"kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.664879 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.687581 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjcx\" (UniqueName: \"kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx\") pod \"redhat-marketplace-sldtb\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:29 crc kubenswrapper[4990]: I1003 11:27:29.845787 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sldtb"
Oct 03 11:27:30 crc kubenswrapper[4990]: I1003 11:27:30.369081 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"]
Oct 03 11:27:30 crc kubenswrapper[4990]: I1003 11:27:30.786019 4990 generic.go:334] "Generic (PLEG): container finished" podID="be990cfe-941f-4652-96a2-fc5f0036c732" containerID="73bafd3b7682724441920ca87940a84e101be37e32c043a1c7a3efac5c4c9423" exitCode=0
Oct 03 11:27:30 crc kubenswrapper[4990]: I1003 11:27:30.786152 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerDied","Data":"73bafd3b7682724441920ca87940a84e101be37e32c043a1c7a3efac5c4c9423"}
Oct 03 11:27:30 crc kubenswrapper[4990]: I1003 11:27:30.786470 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerStarted","Data":"defb2f3dfe71512ff21a6c5a945a2f08d7ed24255d46aafe3649b4f647b61663"}
Oct 03 11:27:31 crc kubenswrapper[4990]: I1003 11:27:31.798209 4990 generic.go:334] "Generic (PLEG): container finished" podID="be990cfe-941f-4652-96a2-fc5f0036c732" containerID="7a461e51a1b0ffa6cbad4dfe99476fae7c4ad49927d08a0f53483fb69b7440a3" exitCode=0
Oct 03 11:27:31 crc kubenswrapper[4990]: I1003 11:27:31.798279 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerDied","Data":"7a461e51a1b0ffa6cbad4dfe99476fae7c4ad49927d08a0f53483fb69b7440a3"}
Oct 03 11:27:32 crc kubenswrapper[4990]: I1003 11:27:32.810162 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerStarted","Data":"5edc03fa0f87386afa6427321d18977755f25d8e4a5e564467630bb955bfffdd"}
Oct 03 11:27:32 crc kubenswrapper[4990]: I1003 11:27:32.841770 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sldtb" podStartSLOduration=2.30088584 podStartE2EDuration="3.841747945s" podCreationTimestamp="2025-10-03 11:27:29 +0000 UTC" firstStartedPulling="2025-10-03 11:27:30.78875936 +0000 UTC m=+6232.585391257" lastFinishedPulling="2025-10-03 11:27:32.329621475 +0000 UTC m=+6234.126253362" observedRunningTime="2025-10-03 11:27:32.832091115 +0000 UTC m=+6234.628722982" watchObservedRunningTime="2025-10-03 11:27:32.841747945 +0000 UTC m=+6234.638379812"
Oct 03 11:27:38 crc kubenswrapper[4990]: I1003 11:27:38.995500 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-95f65547c-4qtl9"]
Oct 03 11:27:38 crc kubenswrapper[4990]: I1003 11:27:38.997755 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:38 crc kubenswrapper[4990]: I1003 11:27:38.999775 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.000117 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-zg72g"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.000249 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.000362 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.012036 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95f65547c-4qtl9"]
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.086101 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.086388 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-log" containerID="cri-o://2cacbb7f9df592a55f01ff914f7723e441ad894fabfd848c9f8b1b934c47d48b" gracePeriod=30
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.086910 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-httpd" containerID="cri-o://2b654eb5a2db01992f65019e694c6843e211ff9263b940caffdb0e23a8d6e865" gracePeriod=30
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.103544 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"]
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.105235 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.115989 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"]
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.166604 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.166680 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.166782 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2z4\" (UniqueName: \"kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.166810 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.166838 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.179404 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.179656 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-log" containerID="cri-o://7e546ca8c8fe3b8c85be0de3d8de112917e66d03189481086058603db75386ba" gracePeriod=30
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.179791 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-httpd" containerID="cri-o://2238d8b26788c9ac7065f48d5b95abea3b7acec0b17d161d16a98893ee2488f8" gracePeriod=30
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268774 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268858 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268886 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmbd\" (UniqueName: \"kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268909 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.268958 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269015 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269043 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269059 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2z4\" (UniqueName: \"kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269082 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269425 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.269758 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.270772 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.273627 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.286614 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2z4\" (UniqueName: \"kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4\") pod \"horizon-95f65547c-4qtl9\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.342026 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95f65547c-4qtl9"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.370995 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.371092 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96"
Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.371138 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data\")
pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.371169 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmbd\" (UniqueName: \"kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.371235 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.371654 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.372366 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.372688 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc 
kubenswrapper[4990]: I1003 11:27:39.374564 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.396425 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmbd\" (UniqueName: \"kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd\") pod \"horizon-b8dfc7857-h4j96\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.465132 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.821196 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95f65547c-4qtl9"] Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.846423 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.847334 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.880796 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerStarted","Data":"15536b5dcf5bb06a6165e8fae702bdc3eb3fb2c9d8743096f3f9e44da46b450b"} Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.883775 4990 generic.go:334] "Generic (PLEG): container finished" podID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" 
containerID="7e546ca8c8fe3b8c85be0de3d8de112917e66d03189481086058603db75386ba" exitCode=143 Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.883832 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerDied","Data":"7e546ca8c8fe3b8c85be0de3d8de112917e66d03189481086058603db75386ba"} Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.887146 4990 generic.go:334] "Generic (PLEG): container finished" podID="56216570-ed46-42c2-98a7-c6986edd89e9" containerID="2cacbb7f9df592a55f01ff914f7723e441ad894fabfd848c9f8b1b934c47d48b" exitCode=143 Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.887411 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerDied","Data":"2cacbb7f9df592a55f01ff914f7723e441ad894fabfd848c9f8b1b934c47d48b"} Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.893872 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:39 crc kubenswrapper[4990]: I1003 11:27:39.978074 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"] Oct 03 11:27:40 crc kubenswrapper[4990]: I1003 11:27:40.904162 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerStarted","Data":"fc81a9e1fd5e3227e1c0bb8f0889d893ab592290a418d4addf7505b17e376882"} Oct 03 11:27:40 crc kubenswrapper[4990]: I1003 11:27:40.965405 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.011962 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"] 
Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.191731 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95f65547c-4qtl9"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.222865 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.227874 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.230635 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.237239 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.274450 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.298709 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.300404 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.316925 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420399 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420554 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420618 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420645 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.420971 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421034 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421058 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421178 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421205 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421285 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjsgq\" (UniqueName: \"kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421344 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhzl\" (UniqueName: \"kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.421373 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523254 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523350 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523401 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523454 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523498 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523561 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjsgq\" (UniqueName: 
\"kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523583 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhzl\" (UniqueName: \"kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523631 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523661 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523720 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523772 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key\") pod 
\"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523798 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523846 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.523892 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.524421 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.525082 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 
11:27:41.525419 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.526018 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.526670 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.528900 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.532561 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.532978 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.533546 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.534279 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.534372 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.539284 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle\") pod \"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.542625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjsgq\" (UniqueName: \"kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq\") pod 
\"horizon-6c5fb6fc74-sw79s\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.542656 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhzl\" (UniqueName: \"kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl\") pod \"horizon-757947bb9d-q7gjf\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.566577 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:41 crc kubenswrapper[4990]: I1003 11:27:41.645207 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.068332 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:27:42 crc kubenswrapper[4990]: W1003 11:27:42.076685 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661bb646_9a34_4caf_b571_aa48afc768ca.slice/crio-a15afcbcc4301bccb8f16ba6526da3a6a5d6fbd84c5a0e8ae8e82e4e5e358ec0 WatchSource:0}: Error finding container a15afcbcc4301bccb8f16ba6526da3a6a5d6fbd84c5a0e8ae8e82e4e5e358ec0: Status 404 returned error can't find the container with id a15afcbcc4301bccb8f16ba6526da3a6a5d6fbd84c5a0e8ae8e82e4e5e358ec0 Oct 03 11:27:42 crc kubenswrapper[4990]: W1003 11:27:42.168340 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf08ca166_3f80_40f9_9c8d_d4bf3ca2df9d.slice/crio-a43b04250082e619513d75d2e0bc2037466e7f98179cf728b1537124f580cb06 WatchSource:0}: Error finding container a43b04250082e619513d75d2e0bc2037466e7f98179cf728b1537124f580cb06: Status 404 
returned error can't find the container with id a43b04250082e619513d75d2e0bc2037466e7f98179cf728b1537124f580cb06 Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.169314 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.934595 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerStarted","Data":"a43b04250082e619513d75d2e0bc2037466e7f98179cf728b1537124f580cb06"} Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.936746 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerStarted","Data":"a15afcbcc4301bccb8f16ba6526da3a6a5d6fbd84c5a0e8ae8e82e4e5e358ec0"} Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.940371 4990 generic.go:334] "Generic (PLEG): container finished" podID="56216570-ed46-42c2-98a7-c6986edd89e9" containerID="2b654eb5a2db01992f65019e694c6843e211ff9263b940caffdb0e23a8d6e865" exitCode=0 Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.940448 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerDied","Data":"2b654eb5a2db01992f65019e694c6843e211ff9263b940caffdb0e23a8d6e865"} Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.942234 4990 generic.go:334] "Generic (PLEG): container finished" podID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerID="2238d8b26788c9ac7065f48d5b95abea3b7acec0b17d161d16a98893ee2488f8" exitCode=0 Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.942251 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerDied","Data":"2238d8b26788c9ac7065f48d5b95abea3b7acec0b17d161d16a98893ee2488f8"} Oct 03 11:27:42 crc kubenswrapper[4990]: I1003 11:27:42.942440 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sldtb" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="registry-server" containerID="cri-o://5edc03fa0f87386afa6427321d18977755f25d8e4a5e564467630bb955bfffdd" gracePeriod=2 Oct 03 11:27:43 crc kubenswrapper[4990]: I1003 11:27:43.957246 4990 generic.go:334] "Generic (PLEG): container finished" podID="be990cfe-941f-4652-96a2-fc5f0036c732" containerID="5edc03fa0f87386afa6427321d18977755f25d8e4a5e564467630bb955bfffdd" exitCode=0 Oct 03 11:27:43 crc kubenswrapper[4990]: I1003 11:27:43.957308 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerDied","Data":"5edc03fa0f87386afa6427321d18977755f25d8e4a5e564467630bb955bfffdd"} Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.414991 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.422295 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.562879 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.562916 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.562968 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563049 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563068 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6mj\" (UniqueName: \"kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563146 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563165 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563190 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563217 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563261 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563299 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563328 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle\") pod \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\" (UID: \"660ed9e6-0b82-4108-a6f4-d348b90db8d2\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563361 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87gbx\" (UniqueName: \"kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.563400 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts\") pod \"56216570-ed46-42c2-98a7-c6986edd89e9\" (UID: \"56216570-ed46-42c2-98a7-c6986edd89e9\") " Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.564173 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.564555 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.566884 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs" (OuterVolumeSpecName: "logs") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.567832 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs" (OuterVolumeSpecName: "logs") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.575163 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts" (OuterVolumeSpecName: "scripts") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.575578 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj" (OuterVolumeSpecName: "kube-api-access-8m6mj") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "kube-api-access-8m6mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.577117 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx" (OuterVolumeSpecName: "kube-api-access-87gbx") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "kube-api-access-87gbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.589723 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts" (OuterVolumeSpecName: "scripts") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.658749 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666525 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6mj\" (UniqueName: \"kubernetes.io/projected/660ed9e6-0b82-4108-a6f4-d348b90db8d2-kube-api-access-8m6mj\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666557 4990 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666568 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56216570-ed46-42c2-98a7-c6986edd89e9-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666579 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666587 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666620 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87gbx\" (UniqueName: \"kubernetes.io/projected/56216570-ed46-42c2-98a7-c6986edd89e9-kube-api-access-87gbx\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666630 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666638 4990 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/660ed9e6-0b82-4108-a6f4-d348b90db8d2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.666646 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.679441 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.697251 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.702637 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data" (OuterVolumeSpecName: "config-data") pod "660ed9e6-0b82-4108-a6f4-d348b90db8d2" (UID: "660ed9e6-0b82-4108-a6f4-d348b90db8d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.727148 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data" (OuterVolumeSpecName: "config-data") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.729461 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "56216570-ed46-42c2-98a7-c6986edd89e9" (UID: "56216570-ed46-42c2-98a7-c6986edd89e9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.768419 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.768458 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.768469 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.768477 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660ed9e6-0b82-4108-a6f4-d348b90db8d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 
03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.768486 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56216570-ed46-42c2-98a7-c6986edd89e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:47 crc kubenswrapper[4990]: I1003 11:27:47.894037 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.012986 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56216570-ed46-42c2-98a7-c6986edd89e9","Type":"ContainerDied","Data":"1e4bd7dc531c10851c16973aea7aa230d211b8816ddb461d40f2d08db4b2fda9"} Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.013047 4990 scope.go:117] "RemoveContainer" containerID="2b654eb5a2db01992f65019e694c6843e211ff9263b940caffdb0e23a8d6e865" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.013198 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.021687 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"660ed9e6-0b82-4108-a6f4-d348b90db8d2","Type":"ContainerDied","Data":"ee932a488f47fdbe40824db0692961e9a86dec9dd1588b72b2229dae73773c94"} Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.021694 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.026294 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sldtb" event={"ID":"be990cfe-941f-4652-96a2-fc5f0036c732","Type":"ContainerDied","Data":"defb2f3dfe71512ff21a6c5a945a2f08d7ed24255d46aafe3649b4f647b61663"} Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.026466 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sldtb" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.051631 4990 scope.go:117] "RemoveContainer" containerID="2cacbb7f9df592a55f01ff914f7723e441ad894fabfd848c9f8b1b934c47d48b" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.064671 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.073635 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content\") pod \"be990cfe-941f-4652-96a2-fc5f0036c732\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.073865 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities\") pod \"be990cfe-941f-4652-96a2-fc5f0036c732\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") " Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.073952 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfjcx\" (UniqueName: \"kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx\") pod \"be990cfe-941f-4652-96a2-fc5f0036c732\" (UID: \"be990cfe-941f-4652-96a2-fc5f0036c732\") 
" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.075403 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities" (OuterVolumeSpecName: "utilities") pod "be990cfe-941f-4652-96a2-fc5f0036c732" (UID: "be990cfe-941f-4652-96a2-fc5f0036c732"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.082019 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.088179 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx" (OuterVolumeSpecName: "kube-api-access-jfjcx") pod "be990cfe-941f-4652-96a2-fc5f0036c732" (UID: "be990cfe-941f-4652-96a2-fc5f0036c732"). InnerVolumeSpecName "kube-api-access-jfjcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.092804 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be990cfe-941f-4652-96a2-fc5f0036c732" (UID: "be990cfe-941f-4652-96a2-fc5f0036c732"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.092893 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.097065 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.107971 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.108704 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.108789 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.108867 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.108948 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.109014 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109084 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.109140 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="extract-utilities" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109188 4990 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="extract-utilities" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.109244 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="registry-server" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109289 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="registry-server" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.109352 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109418 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: E1003 11:27:48.109484 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="extract-content" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109560 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="extract-content" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109899 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.109980 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" containerName="registry-server" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.110048 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.110135 4990 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" containerName="glance-log" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.110209 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" containerName="glance-httpd" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.111493 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.119998 4990 scope.go:117] "RemoveContainer" containerID="2238d8b26788c9ac7065f48d5b95abea3b7acec0b17d161d16a98893ee2488f8" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.120611 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.120642 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.120874 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.120793 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-whwcz" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.137840 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.140097 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.145434 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.152163 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.160019 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.170156 4990 scope.go:117] "RemoveContainer" containerID="7e546ca8c8fe3b8c85be0de3d8de112917e66d03189481086058603db75386ba" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.173379 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.196984 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.197042 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be990cfe-941f-4652-96a2-fc5f0036c732-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.197058 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfjcx\" (UniqueName: \"kubernetes.io/projected/be990cfe-941f-4652-96a2-fc5f0036c732-kube-api-access-jfjcx\") on node \"crc\" DevicePath \"\"" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.220620 4990 scope.go:117] "RemoveContainer" containerID="5edc03fa0f87386afa6427321d18977755f25d8e4a5e564467630bb955bfffdd" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.289324 4990 
scope.go:117] "RemoveContainer" containerID="7a461e51a1b0ffa6cbad4dfe99476fae7c4ad49927d08a0f53483fb69b7440a3" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299642 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299726 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299773 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299821 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szp28\" (UniqueName: \"kubernetes.io/projected/ac650573-e608-46d6-852b-d31c897b95e1-kube-api-access-szp28\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbr5z\" (UniqueName: 
\"kubernetes.io/projected/d04849bc-a827-49d5-8ccc-901073a459cf-kube-api-access-kbr5z\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299953 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.299985 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300012 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300034 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300060 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300130 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-logs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300174 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300200 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.300262 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402242 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402297 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402320 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402341 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402359 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402415 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-logs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402442 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402466 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402484 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402546 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402578 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402611 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402655 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szp28\" (UniqueName: \"kubernetes.io/projected/ac650573-e608-46d6-852b-d31c897b95e1-kube-api-access-szp28\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.402696 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbr5z\" (UniqueName: \"kubernetes.io/projected/d04849bc-a827-49d5-8ccc-901073a459cf-kube-api-access-kbr5z\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.403119 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.403145 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d04849bc-a827-49d5-8ccc-901073a459cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.403145 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-logs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.403477 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac650573-e608-46d6-852b-d31c897b95e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.408669 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.410465 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.410877 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc 
kubenswrapper[4990]: I1003 11:27:48.413191 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.413204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.416243 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04849bc-a827-49d5-8ccc-901073a459cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.416857 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.421374 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac650573-e608-46d6-852b-d31c897b95e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.421984 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-szp28\" (UniqueName: \"kubernetes.io/projected/ac650573-e608-46d6-852b-d31c897b95e1-kube-api-access-szp28\") pod \"glance-default-external-api-0\" (UID: \"ac650573-e608-46d6-852b-d31c897b95e1\") " pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.424078 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbr5z\" (UniqueName: \"kubernetes.io/projected/d04849bc-a827-49d5-8ccc-901073a459cf-kube-api-access-kbr5z\") pod \"glance-default-internal-api-0\" (UID: \"d04849bc-a827-49d5-8ccc-901073a459cf\") " pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.463696 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.495057 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.643342 4990 scope.go:117] "RemoveContainer" containerID="73bafd3b7682724441920ca87940a84e101be37e32c043a1c7a3efac5c4c9423" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.722984 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.748282 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sldtb"] Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.900234 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56216570-ed46-42c2-98a7-c6986edd89e9" path="/var/lib/kubelet/pods/56216570-ed46-42c2-98a7-c6986edd89e9/volumes" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.901272 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660ed9e6-0b82-4108-a6f4-d348b90db8d2" 
path="/var/lib/kubelet/pods/660ed9e6-0b82-4108-a6f4-d348b90db8d2/volumes" Oct 03 11:27:48 crc kubenswrapper[4990]: I1003 11:27:48.902089 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be990cfe-941f-4652-96a2-fc5f0036c732" path="/var/lib/kubelet/pods/be990cfe-941f-4652-96a2-fc5f0036c732/volumes" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.044012 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerStarted","Data":"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.044064 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerStarted","Data":"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.044126 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-95f65547c-4qtl9" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon-log" containerID="cri-o://dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" gracePeriod=30 Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.044201 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-95f65547c-4qtl9" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon" containerID="cri-o://f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" gracePeriod=30 Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.057827 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerStarted","Data":"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed"} Oct 03 11:27:49 crc kubenswrapper[4990]: 
I1003 11:27:49.057866 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerStarted","Data":"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.065457 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerStarted","Data":"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.065528 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerStarted","Data":"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.065669 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b8dfc7857-h4j96" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon-log" containerID="cri-o://db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" gracePeriod=30 Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.065968 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b8dfc7857-h4j96" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon" containerID="cri-o://c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" gracePeriod=30 Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.069376 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerStarted","Data":"7ed529597a201c12939e68f694f31732844059cdb79f16e968a19d01e012126d"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.069415 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerStarted","Data":"98f2532ad8c86aaf6e829dc714e8c0e57927f18f5be486acb2fccbcfc7fffa60"} Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.104831 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-95f65547c-4qtl9" podStartSLOduration=3.4061436130000002 podStartE2EDuration="11.104810252s" podCreationTimestamp="2025-10-03 11:27:38 +0000 UTC" firstStartedPulling="2025-10-03 11:27:39.823391978 +0000 UTC m=+6241.620023825" lastFinishedPulling="2025-10-03 11:27:47.522058597 +0000 UTC m=+6249.318690464" observedRunningTime="2025-10-03 11:27:49.082750513 +0000 UTC m=+6250.879382380" watchObservedRunningTime="2025-10-03 11:27:49.104810252 +0000 UTC m=+6250.901442119" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.107673 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-757947bb9d-q7gjf" podStartSLOduration=2.777600442 podStartE2EDuration="8.107658006s" podCreationTimestamp="2025-10-03 11:27:41 +0000 UTC" firstStartedPulling="2025-10-03 11:27:42.079283526 +0000 UTC m=+6243.875915383" lastFinishedPulling="2025-10-03 11:27:47.40934109 +0000 UTC m=+6249.205972947" observedRunningTime="2025-10-03 11:27:49.100971973 +0000 UTC m=+6250.897603840" watchObservedRunningTime="2025-10-03 11:27:49.107658006 +0000 UTC m=+6250.904289873" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.134628 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b8dfc7857-h4j96" podStartSLOduration=2.734443291 podStartE2EDuration="10.134603651s" podCreationTimestamp="2025-10-03 11:27:39 +0000 UTC" firstStartedPulling="2025-10-03 11:27:39.988478076 +0000 UTC m=+6241.785109933" lastFinishedPulling="2025-10-03 11:27:47.388638426 +0000 UTC m=+6249.185270293" observedRunningTime="2025-10-03 11:27:49.122867278 +0000 UTC 
m=+6250.919499135" watchObservedRunningTime="2025-10-03 11:27:49.134603651 +0000 UTC m=+6250.931235528" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.141587 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c5fb6fc74-sw79s" podStartSLOduration=2.924146923 podStartE2EDuration="8.141571791s" podCreationTimestamp="2025-10-03 11:27:41 +0000 UTC" firstStartedPulling="2025-10-03 11:27:42.171195577 +0000 UTC m=+6243.967827434" lastFinishedPulling="2025-10-03 11:27:47.388620445 +0000 UTC m=+6249.185252302" observedRunningTime="2025-10-03 11:27:49.140571685 +0000 UTC m=+6250.937203552" watchObservedRunningTime="2025-10-03 11:27:49.141571791 +0000 UTC m=+6250.938203648" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.178783 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 11:27:49 crc kubenswrapper[4990]: W1003 11:27:49.182127 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac650573_e608_46d6_852b_d31c897b95e1.slice/crio-05f21cdabfda4f8e946f76b490af17c828aad9e47cd7bd15fde53f1d7e28893b WatchSource:0}: Error finding container 05f21cdabfda4f8e946f76b490af17c828aad9e47cd7bd15fde53f1d7e28893b: Status 404 returned error can't find the container with id 05f21cdabfda4f8e946f76b490af17c828aad9e47cd7bd15fde53f1d7e28893b Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.261194 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.342596 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-95f65547c-4qtl9" Oct 03 11:27:49 crc kubenswrapper[4990]: I1003 11:27:49.465906 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:27:50 crc kubenswrapper[4990]: I1003 
11:27:50.186273 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac650573-e608-46d6-852b-d31c897b95e1","Type":"ContainerStarted","Data":"0e2426525c4ef1687598f9f2aa5f493241d8befa1cdd79d44f77b62f238aed83"} Oct 03 11:27:50 crc kubenswrapper[4990]: I1003 11:27:50.186592 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac650573-e608-46d6-852b-d31c897b95e1","Type":"ContainerStarted","Data":"05f21cdabfda4f8e946f76b490af17c828aad9e47cd7bd15fde53f1d7e28893b"} Oct 03 11:27:50 crc kubenswrapper[4990]: I1003 11:27:50.195781 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04849bc-a827-49d5-8ccc-901073a459cf","Type":"ContainerStarted","Data":"fd1129fa9e42148387b733e4bd7f2c89057239f4895095a9deaad7e184052755"} Oct 03 11:27:50 crc kubenswrapper[4990]: I1003 11:27:50.195832 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d04849bc-a827-49d5-8ccc-901073a459cf","Type":"ContainerStarted","Data":"962b99aad8deb012a597c509bb8df01cb01a68dc4851066dfa9795183fe84a37"} Oct 03 11:27:50 crc kubenswrapper[4990]: I1003 11:27:50.243677 4990 scope.go:117] "RemoveContainer" containerID="def4c4a1ecfd8764dd90657ae49e91d3ed0c784a2332913c1ffc040c0c2aafc4" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.206999 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac650573-e608-46d6-852b-d31c897b95e1","Type":"ContainerStarted","Data":"9b2428ea1432db3c4574558c1466e4e4a158fc87358e2e3e4d95173204a81683"} Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.210077 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d04849bc-a827-49d5-8ccc-901073a459cf","Type":"ContainerStarted","Data":"b5cc8082ab564cb421a3ba51e9502269fda0b00d28c78efa02351cd019b963b0"} Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.249086 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.249057291 podStartE2EDuration="3.249057291s" podCreationTimestamp="2025-10-03 11:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:27:51.233031128 +0000 UTC m=+6253.029662985" watchObservedRunningTime="2025-10-03 11:27:51.249057291 +0000 UTC m=+6253.045689178" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.263955 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.263926245 podStartE2EDuration="3.263926245s" podCreationTimestamp="2025-10-03 11:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:27:51.255269921 +0000 UTC m=+6253.051901808" watchObservedRunningTime="2025-10-03 11:27:51.263926245 +0000 UTC m=+6253.060558132" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.566936 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.566992 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.646057 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:27:51 crc kubenswrapper[4990]: I1003 11:27:51.646401 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c5fb6fc74-sw79s" 
Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.464816 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.467481 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.497041 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.497100 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.522112 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.537560 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.539330 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:58 crc kubenswrapper[4990]: I1003 11:27:58.559782 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:59 crc kubenswrapper[4990]: I1003 11:27:59.309794 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 11:27:59 crc kubenswrapper[4990]: I1003 11:27:59.309870 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 11:27:59 crc kubenswrapper[4990]: I1003 11:27:59.309885 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Oct 03 11:27:59 crc kubenswrapper[4990]: I1003 11:27:59.309894 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.405919 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.406615 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.426427 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.426590 4990 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.546372 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.551007 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.569085 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused" Oct 03 11:28:01 crc kubenswrapper[4990]: I1003 11:28:01.662253 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.122:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.122:8443: connect: 
connection refused" Oct 03 11:28:13 crc kubenswrapper[4990]: I1003 11:28:13.371099 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:28:13 crc kubenswrapper[4990]: I1003 11:28:13.562048 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:28:15 crc kubenswrapper[4990]: I1003 11:28:15.101408 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:28:15 crc kubenswrapper[4990]: I1003 11:28:15.267221 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:28:15 crc kubenswrapper[4990]: I1003 11:28:15.330824 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:28:15 crc kubenswrapper[4990]: I1003 11:28:15.464403 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon-log" containerID="cri-o://98f2532ad8c86aaf6e829dc714e8c0e57927f18f5be486acb2fccbcfc7fffa60" gracePeriod=30 Oct 03 11:28:15 crc kubenswrapper[4990]: I1003 11:28:15.464439 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" containerID="cri-o://7ed529597a201c12939e68f694f31732844059cdb79f16e968a19d01e012126d" gracePeriod=30 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.473618 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95f65547c-4qtl9" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.488338 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523285 4990 generic.go:334] "Generic (PLEG): container finished" podID="e67add6b-68dc-404e-be5b-6382a3b88660" containerID="f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" exitCode=137 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523317 4990 generic.go:334] "Generic (PLEG): container finished" podID="e67add6b-68dc-404e-be5b-6382a3b88660" containerID="dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" exitCode=137 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523376 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerDied","Data":"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523403 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerDied","Data":"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523415 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f65547c-4qtl9" event={"ID":"e67add6b-68dc-404e-be5b-6382a3b88660","Type":"ContainerDied","Data":"15536b5dcf5bb06a6165e8fae702bdc3eb3fb2c9d8743096f3f9e44da46b450b"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523430 4990 scope.go:117] "RemoveContainer" containerID="f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.523581 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-95f65547c-4qtl9" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529240 4990 generic.go:334] "Generic (PLEG): container finished" podID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerID="c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" exitCode=137 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529268 4990 generic.go:334] "Generic (PLEG): container finished" podID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerID="db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" exitCode=137 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529333 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8dfc7857-h4j96" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529440 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerDied","Data":"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529486 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerDied","Data":"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.529501 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8dfc7857-h4j96" event={"ID":"cec642e6-9095-4e86-a0b0-3f0910254d86","Type":"ContainerDied","Data":"fc81a9e1fd5e3227e1c0bb8f0889d893ab592290a418d4addf7505b17e376882"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.533479 4990 generic.go:334] "Generic (PLEG): container finished" podID="661bb646-9a34-4caf-b571-aa48afc768ca" containerID="7ed529597a201c12939e68f694f31732844059cdb79f16e968a19d01e012126d" exitCode=0 Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 
11:28:19.533542 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerDied","Data":"7ed529597a201c12939e68f694f31732844059cdb79f16e968a19d01e012126d"} Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.598353 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key\") pod \"cec642e6-9095-4e86-a0b0-3f0910254d86\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.598470 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs\") pod \"cec642e6-9095-4e86-a0b0-3f0910254d86\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.598504 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key\") pod \"e67add6b-68dc-404e-be5b-6382a3b88660\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599036 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2z4\" (UniqueName: \"kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4\") pod \"e67add6b-68dc-404e-be5b-6382a3b88660\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599084 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbmbd\" (UniqueName: \"kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd\") pod \"cec642e6-9095-4e86-a0b0-3f0910254d86\" 
(UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599134 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data\") pod \"e67add6b-68dc-404e-be5b-6382a3b88660\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599167 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts\") pod \"e67add6b-68dc-404e-be5b-6382a3b88660\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599206 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data\") pod \"cec642e6-9095-4e86-a0b0-3f0910254d86\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599231 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs\") pod \"e67add6b-68dc-404e-be5b-6382a3b88660\" (UID: \"e67add6b-68dc-404e-be5b-6382a3b88660\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.599253 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts\") pod \"cec642e6-9095-4e86-a0b0-3f0910254d86\" (UID: \"cec642e6-9095-4e86-a0b0-3f0910254d86\") " Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.602643 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs" (OuterVolumeSpecName: "logs") 
pod "e67add6b-68dc-404e-be5b-6382a3b88660" (UID: "e67add6b-68dc-404e-be5b-6382a3b88660"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.602953 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs" (OuterVolumeSpecName: "logs") pod "cec642e6-9095-4e86-a0b0-3f0910254d86" (UID: "cec642e6-9095-4e86-a0b0-3f0910254d86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.606353 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e67add6b-68dc-404e-be5b-6382a3b88660" (UID: "e67add6b-68dc-404e-be5b-6382a3b88660"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.606500 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd" (OuterVolumeSpecName: "kube-api-access-qbmbd") pod "cec642e6-9095-4e86-a0b0-3f0910254d86" (UID: "cec642e6-9095-4e86-a0b0-3f0910254d86"). InnerVolumeSpecName "kube-api-access-qbmbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.607026 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cec642e6-9095-4e86-a0b0-3f0910254d86" (UID: "cec642e6-9095-4e86-a0b0-3f0910254d86"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.620435 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4" (OuterVolumeSpecName: "kube-api-access-kp2z4") pod "e67add6b-68dc-404e-be5b-6382a3b88660" (UID: "e67add6b-68dc-404e-be5b-6382a3b88660"). InnerVolumeSpecName "kube-api-access-kp2z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.634841 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data" (OuterVolumeSpecName: "config-data") pod "e67add6b-68dc-404e-be5b-6382a3b88660" (UID: "e67add6b-68dc-404e-be5b-6382a3b88660"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.635780 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts" (OuterVolumeSpecName: "scripts") pod "cec642e6-9095-4e86-a0b0-3f0910254d86" (UID: "cec642e6-9095-4e86-a0b0-3f0910254d86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.636692 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts" (OuterVolumeSpecName: "scripts") pod "e67add6b-68dc-404e-be5b-6382a3b88660" (UID: "e67add6b-68dc-404e-be5b-6382a3b88660"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.640060 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data" (OuterVolumeSpecName: "config-data") pod "cec642e6-9095-4e86-a0b0-3f0910254d86" (UID: "cec642e6-9095-4e86-a0b0-3f0910254d86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702244 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp2z4\" (UniqueName: \"kubernetes.io/projected/e67add6b-68dc-404e-be5b-6382a3b88660-kube-api-access-kp2z4\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702311 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbmbd\" (UniqueName: \"kubernetes.io/projected/cec642e6-9095-4e86-a0b0-3f0910254d86-kube-api-access-qbmbd\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702326 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702341 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e67add6b-68dc-404e-be5b-6382a3b88660-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702400 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702416 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e67add6b-68dc-404e-be5b-6382a3b88660-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702427 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cec642e6-9095-4e86-a0b0-3f0910254d86-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702438 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cec642e6-9095-4e86-a0b0-3f0910254d86-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702473 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec642e6-9095-4e86-a0b0-3f0910254d86-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.702486 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e67add6b-68dc-404e-be5b-6382a3b88660-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.706387 4990 scope.go:117] "RemoveContainer" containerID="dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.751615 4990 scope.go:117] "RemoveContainer" containerID="f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" Oct 03 11:28:19 crc kubenswrapper[4990]: E1003 11:28:19.752194 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b\": container with ID starting with f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b not found: ID does not exist" containerID="f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" Oct 03 11:28:19 crc 
kubenswrapper[4990]: I1003 11:28:19.752243 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b"} err="failed to get container status \"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b\": rpc error: code = NotFound desc = could not find container \"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b\": container with ID starting with f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.752271 4990 scope.go:117] "RemoveContainer" containerID="dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" Oct 03 11:28:19 crc kubenswrapper[4990]: E1003 11:28:19.752762 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252\": container with ID starting with dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252 not found: ID does not exist" containerID="dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.752815 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252"} err="failed to get container status \"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252\": rpc error: code = NotFound desc = could not find container \"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252\": container with ID starting with dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252 not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.752843 4990 scope.go:117] "RemoveContainer" containerID="f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b" Oct 03 
11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.753186 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b"} err="failed to get container status \"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b\": rpc error: code = NotFound desc = could not find container \"f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b\": container with ID starting with f2463f0ee249249793607cd176e1227bad080417422d1e91aa9d51a4cae1487b not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.753214 4990 scope.go:117] "RemoveContainer" containerID="dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.753746 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252"} err="failed to get container status \"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252\": rpc error: code = NotFound desc = could not find container \"dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252\": container with ID starting with dcd9708cd977aa9d23756f35e3966df88f35a511833cbb23949b5ad4270d8252 not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.753778 4990 scope.go:117] "RemoveContainer" containerID="c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.896977 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"] Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.906890 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b8dfc7857-h4j96"] Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.916689 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-95f65547c-4qtl9"] Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.926110 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-95f65547c-4qtl9"] Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.945986 4990 scope.go:117] "RemoveContainer" containerID="db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.963188 4990 scope.go:117] "RemoveContainer" containerID="c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" Oct 03 11:28:19 crc kubenswrapper[4990]: E1003 11:28:19.963669 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8\": container with ID starting with c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8 not found: ID does not exist" containerID="c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.963721 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8"} err="failed to get container status \"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8\": rpc error: code = NotFound desc = could not find container \"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8\": container with ID starting with c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8 not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.963753 4990 scope.go:117] "RemoveContainer" containerID="db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" Oct 03 11:28:19 crc kubenswrapper[4990]: E1003 11:28:19.964160 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445\": container with ID starting with db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445 not found: ID does not exist" containerID="db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.964212 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445"} err="failed to get container status \"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445\": rpc error: code = NotFound desc = could not find container \"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445\": container with ID starting with db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445 not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.964237 4990 scope.go:117] "RemoveContainer" containerID="c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.964540 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8"} err="failed to get container status \"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8\": rpc error: code = NotFound desc = could not find container \"c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8\": container with ID starting with c7b56c0d82c1bbc26b3ee291ef05607879273ce8549253a5170d1822dcf562b8 not found: ID does not exist" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.964583 4990 scope.go:117] "RemoveContainer" containerID="db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445" Oct 03 11:28:19 crc kubenswrapper[4990]: I1003 11:28:19.964878 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445"} err="failed to get container status \"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445\": rpc error: code = NotFound desc = could not find container \"db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445\": container with ID starting with db58848c501d6c289018d3b0ff4c09c6d5b018045a5673480d9d9b7ad2507445 not found: ID does not exist" Oct 03 11:28:20 crc kubenswrapper[4990]: I1003 11:28:20.893646 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" path="/var/lib/kubelet/pods/cec642e6-9095-4e86-a0b0-3f0910254d86/volumes" Oct 03 11:28:20 crc kubenswrapper[4990]: I1003 11:28:20.895041 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" path="/var/lib/kubelet/pods/e67add6b-68dc-404e-be5b-6382a3b88660/volumes" Oct 03 11:28:21 crc kubenswrapper[4990]: I1003 11:28:21.567681 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused" Oct 03 11:28:31 crc kubenswrapper[4990]: I1003 11:28:31.568364 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused" Oct 03 11:28:37 crc kubenswrapper[4990]: I1003 11:28:37.052173 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wxppx"] Oct 03 11:28:37 crc kubenswrapper[4990]: I1003 11:28:37.062876 4990 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-db-create-wxppx"] Oct 03 11:28:38 crc kubenswrapper[4990]: I1003 11:28:38.883221 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd54627-c7cf-4726-b0a4-fcca4f641a25" path="/var/lib/kubelet/pods/7dd54627-c7cf-4726-b0a4-fcca4f641a25/volumes" Oct 03 11:28:41 crc kubenswrapper[4990]: I1003 11:28:41.568086 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-757947bb9d-q7gjf" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.121:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.121:8443: connect: connection refused" Oct 03 11:28:41 crc kubenswrapper[4990]: I1003 11:28:41.568890 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:28:45 crc kubenswrapper[4990]: I1003 11:28:45.796778 4990 generic.go:334] "Generic (PLEG): container finished" podID="661bb646-9a34-4caf-b571-aa48afc768ca" containerID="98f2532ad8c86aaf6e829dc714e8c0e57927f18f5be486acb2fccbcfc7fffa60" exitCode=137 Oct 03 11:28:45 crc kubenswrapper[4990]: I1003 11:28:45.796890 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerDied","Data":"98f2532ad8c86aaf6e829dc714e8c0e57927f18f5be486acb2fccbcfc7fffa60"} Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.239035 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424532 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424615 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424763 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424808 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424869 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.424989 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.425056 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmhzl\" (UniqueName: \"kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl\") pod \"661bb646-9a34-4caf-b571-aa48afc768ca\" (UID: \"661bb646-9a34-4caf-b571-aa48afc768ca\") " Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.425704 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs" (OuterVolumeSpecName: "logs") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.430264 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl" (OuterVolumeSpecName: "kube-api-access-pmhzl") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "kube-api-access-pmhzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.436987 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.450486 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data" (OuterVolumeSpecName: "config-data") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.450633 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts" (OuterVolumeSpecName: "scripts") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.455169 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.486280 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "661bb646-9a34-4caf-b571-aa48afc768ca" (UID: "661bb646-9a34-4caf-b571-aa48afc768ca"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527823 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527864 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/661bb646-9a34-4caf-b571-aa48afc768ca-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527873 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527884 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/661bb646-9a34-4caf-b571-aa48afc768ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527894 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmhzl\" (UniqueName: \"kubernetes.io/projected/661bb646-9a34-4caf-b571-aa48afc768ca-kube-api-access-pmhzl\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527903 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.527912 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/661bb646-9a34-4caf-b571-aa48afc768ca-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.810295 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-757947bb9d-q7gjf" event={"ID":"661bb646-9a34-4caf-b571-aa48afc768ca","Type":"ContainerDied","Data":"a15afcbcc4301bccb8f16ba6526da3a6a5d6fbd84c5a0e8ae8e82e4e5e358ec0"} Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.810354 4990 scope.go:117] "RemoveContainer" containerID="7ed529597a201c12939e68f694f31732844059cdb79f16e968a19d01e012126d" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.810364 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-757947bb9d-q7gjf" Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.847553 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.855186 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-757947bb9d-q7gjf"] Oct 03 11:28:46 crc kubenswrapper[4990]: I1003 11:28:46.883235 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" path="/var/lib/kubelet/pods/661bb646-9a34-4caf-b571-aa48afc768ca/volumes" Oct 03 11:28:47 crc kubenswrapper[4990]: I1003 11:28:47.030294 4990 scope.go:117] "RemoveContainer" containerID="98f2532ad8c86aaf6e829dc714e8c0e57927f18f5be486acb2fccbcfc7fffa60" Oct 03 11:28:47 crc kubenswrapper[4990]: I1003 11:28:47.037803 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f979-account-create-zgdsc"] Oct 03 11:28:47 crc kubenswrapper[4990]: I1003 11:28:47.051264 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f979-account-create-zgdsc"] Oct 03 11:28:48 crc kubenswrapper[4990]: I1003 11:28:48.889025 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a6af39-f0fb-4c78-9119-8a0576429a95" path="/var/lib/kubelet/pods/69a6af39-f0fb-4c78-9119-8a0576429a95/volumes" Oct 03 11:28:50 crc kubenswrapper[4990]: I1003 
11:28:50.465403 4990 scope.go:117] "RemoveContainer" containerID="11e77314503db0ca56eb0c2a64e433c770a9a2765dad2e037e3bcba34985d5c9" Oct 03 11:28:50 crc kubenswrapper[4990]: I1003 11:28:50.513269 4990 scope.go:117] "RemoveContainer" containerID="8081059dffbadd1ac4461b04d6fa5f91d310b566dd296a8ae9d0771885fd62e8" Oct 03 11:28:55 crc kubenswrapper[4990]: I1003 11:28:55.033471 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jzsvg"] Oct 03 11:28:55 crc kubenswrapper[4990]: I1003 11:28:55.041132 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jzsvg"] Oct 03 11:28:56 crc kubenswrapper[4990]: I1003 11:28:56.906775 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6893b31d-52e8-4ca8-b8eb-9dd0cff0b520" path="/var/lib/kubelet/pods/6893b31d-52e8-4ca8-b8eb-9dd0cff0b520/volumes" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.031293 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-c86f77854-vcjzs"] Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032166 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032179 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032195 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032202 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032217 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon" Oct 
03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032223 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032245 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032250 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032258 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032264 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: E1003 11:29:20.032282 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032287 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032458 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032476 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="661bb646-9a34-4caf-b571-aa48afc768ca" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032498 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032528 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e67add6b-68dc-404e-be5b-6382a3b88660" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032544 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.032555 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec642e6-9095-4e86-a0b0-3f0910254d86" containerName="horizon-log" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.033588 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.055990 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c86f77854-vcjzs"] Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199033 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-config-data\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199243 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22cd619-3c49-40f7-98c4-9d3e07566fab-logs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199283 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-secret-key\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " 
pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199325 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zclc9\" (UniqueName: \"kubernetes.io/projected/b22cd619-3c49-40f7-98c4-9d3e07566fab-kube-api-access-zclc9\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199385 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-tls-certs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199414 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-scripts\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.199452 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-combined-ca-bundle\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.301196 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-combined-ca-bundle\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " 
pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.301299 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-config-data\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.302658 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-config-data\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.302882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22cd619-3c49-40f7-98c4-9d3e07566fab-logs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303133 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22cd619-3c49-40f7-98c4-9d3e07566fab-logs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303183 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-secret-key\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303267 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zclc9\" (UniqueName: \"kubernetes.io/projected/b22cd619-3c49-40f7-98c4-9d3e07566fab-kube-api-access-zclc9\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303337 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-tls-certs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303362 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-scripts\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.303849 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b22cd619-3c49-40f7-98c4-9d3e07566fab-scripts\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.312448 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-tls-certs\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.313357 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-combined-ca-bundle\") 
pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.324960 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zclc9\" (UniqueName: \"kubernetes.io/projected/b22cd619-3c49-40f7-98c4-9d3e07566fab-kube-api-access-zclc9\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.328288 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b22cd619-3c49-40f7-98c4-9d3e07566fab-horizon-secret-key\") pod \"horizon-c86f77854-vcjzs\" (UID: \"b22cd619-3c49-40f7-98c4-9d3e07566fab\") " pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.354501 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:20 crc kubenswrapper[4990]: I1003 11:29:20.608183 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-c86f77854-vcjzs"] Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.161000 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c86f77854-vcjzs" event={"ID":"b22cd619-3c49-40f7-98c4-9d3e07566fab","Type":"ContainerStarted","Data":"a6ec899b02c505bb067a93c60a8d366815df4e438d62afbbb24a8e6e60c5fb7e"} Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.161316 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c86f77854-vcjzs" event={"ID":"b22cd619-3c49-40f7-98c4-9d3e07566fab","Type":"ContainerStarted","Data":"9f01fd76f42dd21ededc8af8f8492cdee079b332008687106bb5b4d6d9c97cdb"} Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.161327 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-c86f77854-vcjzs" 
event={"ID":"b22cd619-3c49-40f7-98c4-9d3e07566fab","Type":"ContainerStarted","Data":"b791681035209ede9e528b240108300e1161012ce6b21670b661d5374bea187f"} Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.183910 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-c86f77854-vcjzs" podStartSLOduration=1.183893144 podStartE2EDuration="1.183893144s" podCreationTimestamp="2025-10-03 11:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:29:21.180812135 +0000 UTC m=+6342.977444002" watchObservedRunningTime="2025-10-03 11:29:21.183893144 +0000 UTC m=+6342.980524991" Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.643553 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-fkm95"] Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.644940 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fkm95" Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.655313 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fkm95"] Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.833479 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7fj\" (UniqueName: \"kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj\") pod \"heat-db-create-fkm95\" (UID: \"a886fcb0-74f6-4fde-a081-7d56b2174761\") " pod="openstack/heat-db-create-fkm95" Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.935814 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7fj\" (UniqueName: \"kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj\") pod \"heat-db-create-fkm95\" (UID: \"a886fcb0-74f6-4fde-a081-7d56b2174761\") " pod="openstack/heat-db-create-fkm95" Oct 
03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.963316 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7fj\" (UniqueName: \"kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj\") pod \"heat-db-create-fkm95\" (UID: \"a886fcb0-74f6-4fde-a081-7d56b2174761\") " pod="openstack/heat-db-create-fkm95" Oct 03 11:29:21 crc kubenswrapper[4990]: I1003 11:29:21.979681 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fkm95" Oct 03 11:29:22 crc kubenswrapper[4990]: I1003 11:29:22.464958 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fkm95"] Oct 03 11:29:22 crc kubenswrapper[4990]: W1003 11:29:22.469628 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda886fcb0_74f6_4fde_a081_7d56b2174761.slice/crio-ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f WatchSource:0}: Error finding container ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f: Status 404 returned error can't find the container with id ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f Oct 03 11:29:23 crc kubenswrapper[4990]: I1003 11:29:23.199052 4990 generic.go:334] "Generic (PLEG): container finished" podID="a886fcb0-74f6-4fde-a081-7d56b2174761" containerID="71d0ce6656858ea8aa19f8da768656c026cd07cebb8a5daa2fe32779ab262b15" exitCode=0 Oct 03 11:29:23 crc kubenswrapper[4990]: I1003 11:29:23.199416 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fkm95" event={"ID":"a886fcb0-74f6-4fde-a081-7d56b2174761","Type":"ContainerDied","Data":"71d0ce6656858ea8aa19f8da768656c026cd07cebb8a5daa2fe32779ab262b15"} Oct 03 11:29:23 crc kubenswrapper[4990]: I1003 11:29:23.199447 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fkm95" 
event={"ID":"a886fcb0-74f6-4fde-a081-7d56b2174761","Type":"ContainerStarted","Data":"ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f"} Oct 03 11:29:24 crc kubenswrapper[4990]: I1003 11:29:24.701275 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fkm95" Oct 03 11:29:24 crc kubenswrapper[4990]: I1003 11:29:24.789572 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc7fj\" (UniqueName: \"kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj\") pod \"a886fcb0-74f6-4fde-a081-7d56b2174761\" (UID: \"a886fcb0-74f6-4fde-a081-7d56b2174761\") " Oct 03 11:29:24 crc kubenswrapper[4990]: I1003 11:29:24.797938 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj" (OuterVolumeSpecName: "kube-api-access-rc7fj") pod "a886fcb0-74f6-4fde-a081-7d56b2174761" (UID: "a886fcb0-74f6-4fde-a081-7d56b2174761"). InnerVolumeSpecName "kube-api-access-rc7fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:29:24 crc kubenswrapper[4990]: I1003 11:29:24.892023 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc7fj\" (UniqueName: \"kubernetes.io/projected/a886fcb0-74f6-4fde-a081-7d56b2174761-kube-api-access-rc7fj\") on node \"crc\" DevicePath \"\"" Oct 03 11:29:25 crc kubenswrapper[4990]: I1003 11:29:25.218185 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fkm95" event={"ID":"a886fcb0-74f6-4fde-a081-7d56b2174761","Type":"ContainerDied","Data":"ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f"} Oct 03 11:29:25 crc kubenswrapper[4990]: I1003 11:29:25.218504 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe9ebe08553d6a8fb6ad085025ce149d5eee9c63e95ce64b43aec2610a7771f" Oct 03 11:29:25 crc kubenswrapper[4990]: I1003 11:29:25.218284 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fkm95" Oct 03 11:29:25 crc kubenswrapper[4990]: I1003 11:29:25.304282 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:29:25 crc kubenswrapper[4990]: I1003 11:29:25.304341 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:29:30 crc kubenswrapper[4990]: I1003 11:29:30.355737 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:30 crc 
kubenswrapper[4990]: I1003 11:29:30.356365 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:30 crc kubenswrapper[4990]: I1003 11:29:30.358377 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-c86f77854-vcjzs" podUID="b22cd619-3c49-40f7-98c4-9d3e07566fab" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.125:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.125:8443: connect: connection refused" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.753494 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cade-account-create-9ckfv"] Oct 03 11:29:31 crc kubenswrapper[4990]: E1003 11:29:31.754625 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a886fcb0-74f6-4fde-a081-7d56b2174761" containerName="mariadb-database-create" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.754638 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a886fcb0-74f6-4fde-a081-7d56b2174761" containerName="mariadb-database-create" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.754864 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a886fcb0-74f6-4fde-a081-7d56b2174761" containerName="mariadb-database-create" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.755440 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.765198 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.767293 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cade-account-create-9ckfv"] Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.841748 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5spd\" (UniqueName: \"kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd\") pod \"heat-cade-account-create-9ckfv\" (UID: \"313ded8f-3fe0-45db-8834-23a72733168f\") " pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.944573 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5spd\" (UniqueName: \"kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd\") pod \"heat-cade-account-create-9ckfv\" (UID: \"313ded8f-3fe0-45db-8834-23a72733168f\") " pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:31 crc kubenswrapper[4990]: I1003 11:29:31.967974 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5spd\" (UniqueName: \"kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd\") pod \"heat-cade-account-create-9ckfv\" (UID: \"313ded8f-3fe0-45db-8834-23a72733168f\") " pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:32 crc kubenswrapper[4990]: I1003 11:29:32.077198 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:32 crc kubenswrapper[4990]: I1003 11:29:32.565542 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cade-account-create-9ckfv"] Oct 03 11:29:33 crc kubenswrapper[4990]: I1003 11:29:33.313283 4990 generic.go:334] "Generic (PLEG): container finished" podID="313ded8f-3fe0-45db-8834-23a72733168f" containerID="c56359e95628c4d91cb2d160306679e7765c7be89ce2210660495a3bae0ec2d3" exitCode=0 Oct 03 11:29:33 crc kubenswrapper[4990]: I1003 11:29:33.313400 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cade-account-create-9ckfv" event={"ID":"313ded8f-3fe0-45db-8834-23a72733168f","Type":"ContainerDied","Data":"c56359e95628c4d91cb2d160306679e7765c7be89ce2210660495a3bae0ec2d3"} Oct 03 11:29:33 crc kubenswrapper[4990]: I1003 11:29:33.313652 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cade-account-create-9ckfv" event={"ID":"313ded8f-3fe0-45db-8834-23a72733168f","Type":"ContainerStarted","Data":"e3157401421f454ec48ab32645aa4f23f9ee4c9b83760ccee1a5c425affe1706"} Oct 03 11:29:34 crc kubenswrapper[4990]: I1003 11:29:34.806113 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:34 crc kubenswrapper[4990]: I1003 11:29:34.906237 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5spd\" (UniqueName: \"kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd\") pod \"313ded8f-3fe0-45db-8834-23a72733168f\" (UID: \"313ded8f-3fe0-45db-8834-23a72733168f\") " Oct 03 11:29:34 crc kubenswrapper[4990]: I1003 11:29:34.912184 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd" (OuterVolumeSpecName: "kube-api-access-w5spd") pod "313ded8f-3fe0-45db-8834-23a72733168f" (UID: "313ded8f-3fe0-45db-8834-23a72733168f"). InnerVolumeSpecName "kube-api-access-w5spd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:29:35 crc kubenswrapper[4990]: I1003 11:29:35.008434 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5spd\" (UniqueName: \"kubernetes.io/projected/313ded8f-3fe0-45db-8834-23a72733168f-kube-api-access-w5spd\") on node \"crc\" DevicePath \"\"" Oct 03 11:29:35 crc kubenswrapper[4990]: I1003 11:29:35.335135 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cade-account-create-9ckfv" event={"ID":"313ded8f-3fe0-45db-8834-23a72733168f","Type":"ContainerDied","Data":"e3157401421f454ec48ab32645aa4f23f9ee4c9b83760ccee1a5c425affe1706"} Oct 03 11:29:35 crc kubenswrapper[4990]: I1003 11:29:35.335174 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3157401421f454ec48ab32645aa4f23f9ee4c9b83760ccee1a5c425affe1706" Oct 03 11:29:35 crc kubenswrapper[4990]: I1003 11:29:35.335222 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cade-account-create-9ckfv" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.921369 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-54f7b"] Oct 03 11:29:36 crc kubenswrapper[4990]: E1003 11:29:36.922333 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313ded8f-3fe0-45db-8834-23a72733168f" containerName="mariadb-account-create" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.922353 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="313ded8f-3fe0-45db-8834-23a72733168f" containerName="mariadb-account-create" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.922633 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="313ded8f-3fe0-45db-8834-23a72733168f" containerName="mariadb-account-create" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.923458 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.926470 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-64qwt" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.926989 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 11:29:36 crc kubenswrapper[4990]: I1003 11:29:36.935824 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-54f7b"] Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.056978 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.057212 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srlnd\" (UniqueName: \"kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.057537 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.160403 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlnd\" (UniqueName: \"kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.160505 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.160653 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.165433 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.166494 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.181406 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlnd\" (UniqueName: \"kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd\") pod \"heat-db-sync-54f7b\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.259035 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:37 crc kubenswrapper[4990]: I1003 11:29:37.889225 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-54f7b"] Oct 03 11:29:38 crc kubenswrapper[4990]: I1003 11:29:38.375146 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-54f7b" event={"ID":"bfc24b10-6ae7-478b-ad51-7339be03dbcb","Type":"ContainerStarted","Data":"8958ec697743b1fefc08b37a6f1d4498889e2c5ffed30cbe4c416b4ee92bf820"} Oct 03 11:29:42 crc kubenswrapper[4990]: I1003 11:29:42.217619 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:43 crc kubenswrapper[4990]: I1003 11:29:43.889650 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-c86f77854-vcjzs" Oct 03 11:29:43 crc kubenswrapper[4990]: I1003 11:29:43.978368 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:29:43 crc kubenswrapper[4990]: I1003 11:29:43.980692 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon-log" containerID="cri-o://9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98" gracePeriod=30 Oct 03 11:29:43 crc kubenswrapper[4990]: I1003 11:29:43.982992 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" containerID="cri-o://e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed" gracePeriod=30 Oct 03 11:29:46 crc kubenswrapper[4990]: I1003 11:29:46.483381 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-54f7b" 
event={"ID":"bfc24b10-6ae7-478b-ad51-7339be03dbcb","Type":"ContainerStarted","Data":"9ec57d983c322df18e215edc53c421a052eb2202debaa897a20ed35052f3883f"} Oct 03 11:29:46 crc kubenswrapper[4990]: I1003 11:29:46.514259 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-54f7b" podStartSLOduration=2.977875079 podStartE2EDuration="10.514207731s" podCreationTimestamp="2025-10-03 11:29:36 +0000 UTC" firstStartedPulling="2025-10-03 11:29:37.888316225 +0000 UTC m=+6359.684948082" lastFinishedPulling="2025-10-03 11:29:45.424648877 +0000 UTC m=+6367.221280734" observedRunningTime="2025-10-03 11:29:46.503964027 +0000 UTC m=+6368.300595894" watchObservedRunningTime="2025-10-03 11:29:46.514207731 +0000 UTC m=+6368.310839608" Oct 03 11:29:47 crc kubenswrapper[4990]: I1003 11:29:47.497318 4990 generic.go:334] "Generic (PLEG): container finished" podID="bfc24b10-6ae7-478b-ad51-7339be03dbcb" containerID="9ec57d983c322df18e215edc53c421a052eb2202debaa897a20ed35052f3883f" exitCode=0 Oct 03 11:29:47 crc kubenswrapper[4990]: I1003 11:29:47.497412 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-54f7b" event={"ID":"bfc24b10-6ae7-478b-ad51-7339be03dbcb","Type":"ContainerDied","Data":"9ec57d983c322df18e215edc53c421a052eb2202debaa897a20ed35052f3883f"} Oct 03 11:29:47 crc kubenswrapper[4990]: I1003 11:29:47.500203 4990 generic.go:334] "Generic (PLEG): container finished" podID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerID="e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed" exitCode=0 Oct 03 11:29:47 crc kubenswrapper[4990]: I1003 11:29:47.500238 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerDied","Data":"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed"} Oct 03 11:29:48 crc kubenswrapper[4990]: I1003 11:29:48.932261 4990 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.082059 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data\") pod \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.083204 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srlnd\" (UniqueName: \"kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd\") pod \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.083391 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle\") pod \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\" (UID: \"bfc24b10-6ae7-478b-ad51-7339be03dbcb\") " Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.096711 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd" (OuterVolumeSpecName: "kube-api-access-srlnd") pod "bfc24b10-6ae7-478b-ad51-7339be03dbcb" (UID: "bfc24b10-6ae7-478b-ad51-7339be03dbcb"). InnerVolumeSpecName "kube-api-access-srlnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.118733 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfc24b10-6ae7-478b-ad51-7339be03dbcb" (UID: "bfc24b10-6ae7-478b-ad51-7339be03dbcb"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.173190 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data" (OuterVolumeSpecName: "config-data") pod "bfc24b10-6ae7-478b-ad51-7339be03dbcb" (UID: "bfc24b10-6ae7-478b-ad51-7339be03dbcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.185623 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srlnd\" (UniqueName: \"kubernetes.io/projected/bfc24b10-6ae7-478b-ad51-7339be03dbcb-kube-api-access-srlnd\") on node \"crc\" DevicePath \"\"" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.185657 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.185667 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc24b10-6ae7-478b-ad51-7339be03dbcb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.535144 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-54f7b" event={"ID":"bfc24b10-6ae7-478b-ad51-7339be03dbcb","Type":"ContainerDied","Data":"8958ec697743b1fefc08b37a6f1d4498889e2c5ffed30cbe4c416b4ee92bf820"} Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.535197 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8958ec697743b1fefc08b37a6f1d4498889e2c5ffed30cbe4c416b4ee92bf820" Oct 03 11:29:49 crc kubenswrapper[4990]: I1003 11:29:49.535297 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-54f7b" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.664297 4990 scope.go:117] "RemoveContainer" containerID="eac84ab12b81e3fe136aac344cec54856586efb28d93c8e40eb507533c325e79" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.680590 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:29:50 crc kubenswrapper[4990]: E1003 11:29:50.680993 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc24b10-6ae7-478b-ad51-7339be03dbcb" containerName="heat-db-sync" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.681007 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc24b10-6ae7-478b-ad51-7339be03dbcb" containerName="heat-db-sync" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.681236 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc24b10-6ae7-478b-ad51-7339be03dbcb" containerName="heat-db-sync" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.681866 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.698393 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-64qwt" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.698422 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.698741 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.700305 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.799586 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.801125 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.802922 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.831750 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.832908 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.832948 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.833007 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.833067 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k6cg\" (UniqueName: \"kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " 
pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937442 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k6cg\" (UniqueName: \"kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937553 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937607 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937663 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " 
pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937679 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937722 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkgs\" (UniqueName: \"kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.937749 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.938604 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"] Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.939775 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.942838 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.950811 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.953076 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"] Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.959834 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.964230 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:50 crc kubenswrapper[4990]: I1003 11:29:50.972039 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k6cg\" (UniqueName: \"kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg\") pod \"heat-engine-6bc484dfb4-kr56g\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 
11:29:51.042371 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.042653 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.042815 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.042923 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.044160 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.044408 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2qx\" (UniqueName: \"kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.044525 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkgs\" (UniqueName: \"kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.044599 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.048022 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.048302 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.049896 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.065654 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkgs\" (UniqueName: \"kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs\") pod \"heat-api-c6c5fff94-4vqjd\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.066120 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.147198 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.147338 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.147381 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " 
pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.147425 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2qx\" (UniqueName: \"kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.151541 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.154937 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.155385 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.157414 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.171077 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2qx\" (UniqueName: \"kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx\") pod \"heat-cfnapi-67cc696c64-44ps6\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") " pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.431660 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.542446 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.647033 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.122:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.122:8443: connect: connection refused" Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.695310 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:29:51 crc kubenswrapper[4990]: W1003 11:29:51.779081 4990 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda778e77c_5e84_4479_b318_2aa18673a832.slice/crio-cc371254e3145aff2ec98f780de45f72be0e51b91fe1886dac0510eb0ffe77f9 WatchSource:0}: Error finding container cc371254e3145aff2ec98f780de45f72be0e51b91fe1886dac0510eb0ffe77f9: Status 404 returned error can't find the container with id cc371254e3145aff2ec98f780de45f72be0e51b91fe1886dac0510eb0ffe77f9 Oct 03 11:29:51 crc kubenswrapper[4990]: I1003 11:29:51.928753 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"] Oct 03 11:29:51 crc kubenswrapper[4990]: W1003 11:29:51.947827 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod951440f6_bb9c_4d4b_9f17_dcc280463405.slice/crio-46a8935a4b95054ab49bb9d489a7a27d656f714d3ab303a08a10b85685768985 WatchSource:0}: Error finding container 46a8935a4b95054ab49bb9d489a7a27d656f714d3ab303a08a10b85685768985: Status 404 returned error can't find the container with id 46a8935a4b95054ab49bb9d489a7a27d656f714d3ab303a08a10b85685768985 Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.574618 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67cc696c64-44ps6" event={"ID":"951440f6-bb9c-4d4b-9f17-dcc280463405","Type":"ContainerStarted","Data":"46a8935a4b95054ab49bb9d489a7a27d656f714d3ab303a08a10b85685768985"} Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.579635 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-c6c5fff94-4vqjd" event={"ID":"a778e77c-5e84-4479-b318-2aa18673a832","Type":"ContainerStarted","Data":"cc371254e3145aff2ec98f780de45f72be0e51b91fe1886dac0510eb0ffe77f9"} Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.581197 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bc484dfb4-kr56g" 
event={"ID":"0f8e9d9f-6e68-46bc-bf9f-fc536507111b","Type":"ContainerStarted","Data":"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea"} Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.581228 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bc484dfb4-kr56g" event={"ID":"0f8e9d9f-6e68-46bc-bf9f-fc536507111b","Type":"ContainerStarted","Data":"583bd78a6f07e51c38e439d3bcd1027ab1539b2e96a5cf1708d2991d7d556b34"} Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.582587 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:29:52 crc kubenswrapper[4990]: I1003 11:29:52.610109 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6bc484dfb4-kr56g" podStartSLOduration=2.610084367 podStartE2EDuration="2.610084367s" podCreationTimestamp="2025-10-03 11:29:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:29:52.596918787 +0000 UTC m=+6374.393550644" watchObservedRunningTime="2025-10-03 11:29:52.610084367 +0000 UTC m=+6374.406716234" Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.607388 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67cc696c64-44ps6" event={"ID":"951440f6-bb9c-4d4b-9f17-dcc280463405","Type":"ContainerStarted","Data":"f834e5569314a8c33ef6f718e0e3159ddb22f53aed166cab620eb2b28ba31c7e"} Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.607959 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-67cc696c64-44ps6" Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.611028 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-c6c5fff94-4vqjd" 
event={"ID":"a778e77c-5e84-4479-b318-2aa18673a832","Type":"ContainerStarted","Data":"c40d8fedb74587e1d6a32db26b287e1d1eb03fc3f575966e580af4a99eb4fd40"} Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.611115 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.624724 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-67cc696c64-44ps6" podStartSLOduration=2.416894354 podStartE2EDuration="4.624702451s" podCreationTimestamp="2025-10-03 11:29:50 +0000 UTC" firstStartedPulling="2025-10-03 11:29:51.949671142 +0000 UTC m=+6373.746302989" lastFinishedPulling="2025-10-03 11:29:54.157479229 +0000 UTC m=+6375.954111086" observedRunningTime="2025-10-03 11:29:54.622278958 +0000 UTC m=+6376.418910835" watchObservedRunningTime="2025-10-03 11:29:54.624702451 +0000 UTC m=+6376.421334308" Oct 03 11:29:54 crc kubenswrapper[4990]: I1003 11:29:54.643429 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-c6c5fff94-4vqjd" podStartSLOduration=2.267392327 podStartE2EDuration="4.643408703s" podCreationTimestamp="2025-10-03 11:29:50 +0000 UTC" firstStartedPulling="2025-10-03 11:29:51.784632345 +0000 UTC m=+6373.581264202" lastFinishedPulling="2025-10-03 11:29:54.160648721 +0000 UTC m=+6375.957280578" observedRunningTime="2025-10-03 11:29:54.642410188 +0000 UTC m=+6376.439042055" watchObservedRunningTime="2025-10-03 11:29:54.643408703 +0000 UTC m=+6376.440040570" Oct 03 11:29:55 crc kubenswrapper[4990]: I1003 11:29:55.304008 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:29:55 crc kubenswrapper[4990]: I1003 11:29:55.304060 4990 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:29:57 crc kubenswrapper[4990]: I1003 11:29:57.975937 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6c667dbb67-dlps2"] Oct 03 11:29:57 crc kubenswrapper[4990]: I1003 11:29:57.977652 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:57 crc kubenswrapper[4990]: I1003 11:29:57.989624 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c667dbb67-dlps2"] Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.026367 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.028801 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.043648 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.058572 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.059948 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.080535 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104013 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104065 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-combined-ca-bundle\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104090 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104109 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqfx9\" (UniqueName: \"kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104130 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104166 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104186 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrp5k\" (UniqueName: \"kubernetes.io/projected/6f7c56db-6bb3-4a55-a41c-a8140008baf6-kube-api-access-nrp5k\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.104207 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data-custom\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.205884 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 
11:29:58.205992 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206188 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgs29\" (UniqueName: \"kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206299 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206325 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-combined-ca-bundle\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206371 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 
11:29:58.206389 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqfx9\" (UniqueName: \"kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206433 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206539 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206572 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrp5k\" (UniqueName: \"kubernetes.io/projected/6f7c56db-6bb3-4a55-a41c-a8140008baf6-kube-api-access-nrp5k\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206601 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data-custom\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.206700 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.212868 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data-custom\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.212883 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.213239 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-combined-ca-bundle\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.213465 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.213575 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6f7c56db-6bb3-4a55-a41c-a8140008baf6-config-data\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.217053 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.223974 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqfx9\" (UniqueName: \"kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9\") pod \"heat-api-7654d77b97-sxzhh\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.229852 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrp5k\" (UniqueName: \"kubernetes.io/projected/6f7c56db-6bb3-4a55-a41c-a8140008baf6-kube-api-access-nrp5k\") pod \"heat-engine-6c667dbb67-dlps2\" (UID: \"6f7c56db-6bb3-4a55-a41c-a8140008baf6\") " pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.300177 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.308639 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.308704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs29\" (UniqueName: \"kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.308836 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.308869 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.313026 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " 
pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.313525 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.314146 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.328395 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs29\" (UniqueName: \"kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29\") pod \"heat-cfnapi-6888547c6d-9pfmf\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.375953 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.387636 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:29:58 crc kubenswrapper[4990]: W1003 11:29:58.833381 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f7c56db_6bb3_4a55_a41c_a8140008baf6.slice/crio-555998a3caa37a93b7a04566efccc5922c470b47cf18d15d65cd9aa1d1aa4c24 WatchSource:0}: Error finding container 555998a3caa37a93b7a04566efccc5922c470b47cf18d15d65cd9aa1d1aa4c24: Status 404 returned error can't find the container with id 555998a3caa37a93b7a04566efccc5922c470b47cf18d15d65cd9aa1d1aa4c24 Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.838398 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6c667dbb67-dlps2"] Oct 03 11:29:58 crc kubenswrapper[4990]: W1003 11:29:58.997382 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ca5d372_6013_40e6_aee7_e541d60e99d1.slice/crio-c438e65c786e01b05c11585fa969620327f4dd21b692710d538af4e79a6e9e8e WatchSource:0}: Error finding container c438e65c786e01b05c11585fa969620327f4dd21b692710d538af4e79a6e9e8e: Status 404 returned error can't find the container with id c438e65c786e01b05c11585fa969620327f4dd21b692710d538af4e79a6e9e8e Oct 03 11:29:58 crc kubenswrapper[4990]: I1003 11:29:58.999442 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.024217 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.228657 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.230602 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-67cc696c64-44ps6" 
podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" containerID="cri-o://f834e5569314a8c33ef6f718e0e3159ddb22f53aed166cab620eb2b28ba31c7e" gracePeriod=60 Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.246861 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.249606 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-c6c5fff94-4vqjd" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" containerID="cri-o://c40d8fedb74587e1d6a32db26b287e1d1eb03fc3f575966e580af4a99eb4fd40" gracePeriod=60 Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.274548 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9b7c9fb-mfzpc"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.276220 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.279781 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.280023 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.311057 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b7c9fb-mfzpc"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.334806 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7db7b4869c-gx756"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.336465 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.337469 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-combined-ca-bundle\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339032 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339324 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339503 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-internal-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339594 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt84\" (UniqueName: \"kubernetes.io/projected/7e2cb064-93e7-4455-b395-ef9e266be90a-kube-api-access-lwt84\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339662 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-public-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " 
pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339696 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data-custom\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.339737 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.354733 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7db7b4869c-gx756"] Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.357543 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-c6c5fff94-4vqjd" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.130:8004/healthcheck\": EOF" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.441889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-internal-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442188 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwt84\" (UniqueName: \"kubernetes.io/projected/7e2cb064-93e7-4455-b395-ef9e266be90a-kube-api-access-lwt84\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: 
\"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442242 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-public-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442266 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzt4\" (UniqueName: \"kubernetes.io/projected/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-kube-api-access-bkzt4\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442292 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data-custom\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442314 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data-custom\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442338 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: 
\"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442375 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-combined-ca-bundle\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442421 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442449 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-internal-tls-certs\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442481 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-combined-ca-bundle\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.442528 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-public-tls-certs\") pod 
\"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.448495 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-internal-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.448495 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-public-tls-certs\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.450362 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.453679 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-combined-ca-bundle\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.461714 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e2cb064-93e7-4455-b395-ef9e266be90a-config-data-custom\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:29:59 crc 
kubenswrapper[4990]: I1003 11:29:59.466253 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwt84\" (UniqueName: \"kubernetes.io/projected/7e2cb064-93e7-4455-b395-ef9e266be90a-kube-api-access-lwt84\") pod \"heat-api-9b7c9fb-mfzpc\" (UID: \"7e2cb064-93e7-4455-b395-ef9e266be90a\") " pod="openstack/heat-api-9b7c9fb-mfzpc"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-combined-ca-bundle\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544306 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544338 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-internal-tls-certs\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544383 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-public-tls-certs\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544500 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzt4\" (UniqueName: \"kubernetes.io/projected/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-kube-api-access-bkzt4\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.544573 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data-custom\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.548591 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.551440 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-internal-tls-certs\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.552528 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-config-data-custom\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.554551 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-public-tls-certs\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.557904 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-combined-ca-bundle\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.573979 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzt4\" (UniqueName: \"kubernetes.io/projected/aaa8edfc-e5aa-4d3c-bee3-06249d9623ad-kube-api-access-bkzt4\") pod \"heat-cfnapi-7db7b4869c-gx756\" (UID: \"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad\") " pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.661656 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-67cc696c64-44ps6" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.131:8000/healthcheck\": read tcp 10.217.0.2:34570->10.217.1.131:8000: read: connection reset by peer"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.665003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c667dbb67-dlps2" event={"ID":"6f7c56db-6bb3-4a55-a41c-a8140008baf6","Type":"ContainerStarted","Data":"906b8d6336d6f3f9185802c16dc37ee08d7381acd58f037295b3aad0aad72bc2"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.665077 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6c667dbb67-dlps2" event={"ID":"6f7c56db-6bb3-4a55-a41c-a8140008baf6","Type":"ContainerStarted","Data":"555998a3caa37a93b7a04566efccc5922c470b47cf18d15d65cd9aa1d1aa4c24"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.666426 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6c667dbb67-dlps2"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.669912 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" event={"ID":"daca24d0-70d5-4adb-b77a-6b8499d7363e","Type":"ContainerStarted","Data":"ed4323f5f871c0c46c431c4814e37c6d1b135f09c5b3f0ed9c82510cd1ae67d6"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.669955 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" event={"ID":"daca24d0-70d5-4adb-b77a-6b8499d7363e","Type":"ContainerStarted","Data":"65edc8ccd189b93826bd23eaa6b7373376427152e2be854b81c927d2dfeea9ab"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.670408 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6888547c6d-9pfmf"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.675152 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7654d77b97-sxzhh" event={"ID":"8ca5d372-6013-40e6-aee7-e541d60e99d1","Type":"ContainerStarted","Data":"0a2718100d2f28d540c310e4edfc61c6c47fec185b642b2ab229e620c0a4e291"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.675205 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7654d77b97-sxzhh" event={"ID":"8ca5d372-6013-40e6-aee7-e541d60e99d1","Type":"ContainerStarted","Data":"c438e65c786e01b05c11585fa969620327f4dd21b692710d538af4e79a6e9e8e"}
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.676342 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7654d77b97-sxzhh"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.695208 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6c667dbb67-dlps2" podStartSLOduration=2.695186949 podStartE2EDuration="2.695186949s" podCreationTimestamp="2025-10-03 11:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:29:59.685969331 +0000 UTC m=+6381.482601188" watchObservedRunningTime="2025-10-03 11:29:59.695186949 +0000 UTC m=+6381.491818816"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.699837 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b7c9fb-mfzpc"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.717403 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7654d77b97-sxzhh" podStartSLOduration=2.717382442 podStartE2EDuration="2.717382442s" podCreationTimestamp="2025-10-03 11:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:29:59.712093955 +0000 UTC m=+6381.508725812" watchObservedRunningTime="2025-10-03 11:29:59.717382442 +0000 UTC m=+6381.514014299"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.723771 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:29:59 crc kubenswrapper[4990]: I1003 11:29:59.734886 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" podStartSLOduration=1.734856412 podStartE2EDuration="1.734856412s" podCreationTimestamp="2025-10-03 11:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:29:59.730236643 +0000 UTC m=+6381.526868520" watchObservedRunningTime="2025-10-03 11:29:59.734856412 +0000 UTC m=+6381.531488289"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.144372 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"]
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.146381 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.149262 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.149424 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.162649 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"]
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.226535 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7db7b4869c-gx756"]
Oct 03 11:30:00 crc kubenswrapper[4990]: W1003 11:30:00.231575 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa8edfc_e5aa_4d3c_bee3_06249d9623ad.slice/crio-1ba0f99cb46f0b1ab7b079ffd63f291f4b17fa5eb51617af4071fcfdf7108ed7 WatchSource:0}: Error finding container 1ba0f99cb46f0b1ab7b079ffd63f291f4b17fa5eb51617af4071fcfdf7108ed7: Status 404 returned error can't find the container with id 1ba0f99cb46f0b1ab7b079ffd63f291f4b17fa5eb51617af4071fcfdf7108ed7
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.274087 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmcs\" (UniqueName: \"kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.274545 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.274737 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.304123 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b7c9fb-mfzpc"]
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.376581 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmcs\" (UniqueName: \"kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.376787 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.376857 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.377661 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.381155 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.393894 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmcs\" (UniqueName: \"kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs\") pod \"collect-profiles-29324850-sjvzz\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.483648 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.695236 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b7c9fb-mfzpc" event={"ID":"7e2cb064-93e7-4455-b395-ef9e266be90a","Type":"ContainerStarted","Data":"d52c351e2ec2a46247baffff92434359e0759cff5b860f6f30a9977246da5bb2"}
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.696779 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7db7b4869c-gx756" event={"ID":"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad","Type":"ContainerStarted","Data":"1ba0f99cb46f0b1ab7b079ffd63f291f4b17fa5eb51617af4071fcfdf7108ed7"}
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.698370 4990 generic.go:334] "Generic (PLEG): container finished" podID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerID="ed4323f5f871c0c46c431c4814e37c6d1b135f09c5b3f0ed9c82510cd1ae67d6" exitCode=1
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.698418 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" event={"ID":"daca24d0-70d5-4adb-b77a-6b8499d7363e","Type":"ContainerDied","Data":"ed4323f5f871c0c46c431c4814e37c6d1b135f09c5b3f0ed9c82510cd1ae67d6"}
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.699066 4990 scope.go:117] "RemoveContainer" containerID="ed4323f5f871c0c46c431c4814e37c6d1b135f09c5b3f0ed9c82510cd1ae67d6"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.712479 4990 generic.go:334] "Generic (PLEG): container finished" podID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerID="0a2718100d2f28d540c310e4edfc61c6c47fec185b642b2ab229e620c0a4e291" exitCode=1
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.712666 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7654d77b97-sxzhh" event={"ID":"8ca5d372-6013-40e6-aee7-e541d60e99d1","Type":"ContainerDied","Data":"0a2718100d2f28d540c310e4edfc61c6c47fec185b642b2ab229e620c0a4e291"}
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.713190 4990 scope.go:117] "RemoveContainer" containerID="0a2718100d2f28d540c310e4edfc61c6c47fec185b642b2ab229e620c0a4e291"
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.732814 4990 generic.go:334] "Generic (PLEG): container finished" podID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerID="f834e5569314a8c33ef6f718e0e3159ddb22f53aed166cab620eb2b28ba31c7e" exitCode=0
Oct 03 11:30:00 crc kubenswrapper[4990]: I1003 11:30:00.734603 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67cc696c64-44ps6" event={"ID":"951440f6-bb9c-4d4b-9f17-dcc280463405","Type":"ContainerDied","Data":"f834e5569314a8c33ef6f718e0e3159ddb22f53aed166cab620eb2b28ba31c7e"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.046009 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6cq5h"]
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.053928 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6cq5h"]
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.067522 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"]
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.520674 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67cc696c64-44ps6"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.620301 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2qx\" (UniqueName: \"kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx\") pod \"951440f6-bb9c-4d4b-9f17-dcc280463405\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") "
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.620385 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data\") pod \"951440f6-bb9c-4d4b-9f17-dcc280463405\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") "
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.620616 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom\") pod \"951440f6-bb9c-4d4b-9f17-dcc280463405\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") "
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.620648 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle\") pod \"951440f6-bb9c-4d4b-9f17-dcc280463405\" (UID: \"951440f6-bb9c-4d4b-9f17-dcc280463405\") "
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.625736 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx" (OuterVolumeSpecName: "kube-api-access-2q2qx") pod "951440f6-bb9c-4d4b-9f17-dcc280463405" (UID: "951440f6-bb9c-4d4b-9f17-dcc280463405"). InnerVolumeSpecName "kube-api-access-2q2qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.647279 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.122:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.122:8443: connect: connection refused"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.651685 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "951440f6-bb9c-4d4b-9f17-dcc280463405" (UID: "951440f6-bb9c-4d4b-9f17-dcc280463405"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.676962 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "951440f6-bb9c-4d4b-9f17-dcc280463405" (UID: "951440f6-bb9c-4d4b-9f17-dcc280463405"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.701659 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data" (OuterVolumeSpecName: "config-data") pod "951440f6-bb9c-4d4b-9f17-dcc280463405" (UID: "951440f6-bb9c-4d4b-9f17-dcc280463405"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.722672 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.722714 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.722728 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951440f6-bb9c-4d4b-9f17-dcc280463405-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.722739 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2qx\" (UniqueName: \"kubernetes.io/projected/951440f6-bb9c-4d4b-9f17-dcc280463405-kube-api-access-2q2qx\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.765174 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-67cc696c64-44ps6"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.766492 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-67cc696c64-44ps6" event={"ID":"951440f6-bb9c-4d4b-9f17-dcc280463405","Type":"ContainerDied","Data":"46a8935a4b95054ab49bb9d489a7a27d656f714d3ab303a08a10b85685768985"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.766559 4990 scope.go:117] "RemoveContainer" containerID="f834e5569314a8c33ef6f718e0e3159ddb22f53aed166cab620eb2b28ba31c7e"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.771049 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b7c9fb-mfzpc" event={"ID":"7e2cb064-93e7-4455-b395-ef9e266be90a","Type":"ContainerStarted","Data":"571306422cf368670d75fc9775d329349a6f9692a02f42daf58d53429eb1a71f"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.771127 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9b7c9fb-mfzpc"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.777081 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7db7b4869c-gx756" event={"ID":"aaa8edfc-e5aa-4d3c-bee3-06249d9623ad","Type":"ContainerStarted","Data":"4a16e73f325efaf2130a9ac5000c643d53779e26bbababea0594e0e7ce4de0fd"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.777501 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7db7b4869c-gx756"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.788754 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9b7c9fb-mfzpc" podStartSLOduration=2.788737819 podStartE2EDuration="2.788737819s" podCreationTimestamp="2025-10-03 11:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:30:01.788049931 +0000 UTC m=+6383.584681778" watchObservedRunningTime="2025-10-03 11:30:01.788737819 +0000 UTC m=+6383.585369676"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.794148 4990 generic.go:334] "Generic (PLEG): container finished" podID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109" exitCode=1
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.794251 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" event={"ID":"daca24d0-70d5-4adb-b77a-6b8499d7363e","Type":"ContainerDied","Data":"c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.795096 4990 scope.go:117] "RemoveContainer" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109"
Oct 03 11:30:01 crc kubenswrapper[4990]: E1003 11:30:01.795396 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6888547c6d-9pfmf_openstack(daca24d0-70d5-4adb-b77a-6b8499d7363e)\"" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.798110 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz" event={"ID":"4070aa48-04e2-4498-a16d-7f20675e55c7","Type":"ContainerStarted","Data":"db2420606c169afcbcab781b7a7639f76f63b010145ab959711f2cd1a6c2b4e6"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.798148 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz" event={"ID":"4070aa48-04e2-4498-a16d-7f20675e55c7","Type":"ContainerStarted","Data":"104bed9ca84294d329c821cf2099745c933945b021491f99814470178c8a53fe"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.801911 4990 generic.go:334] "Generic (PLEG): container finished" podID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerID="7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb" exitCode=1
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.803238 4990 scope.go:117] "RemoveContainer" containerID="7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb"
Oct 03 11:30:01 crc kubenswrapper[4990]: E1003 11:30:01.803490 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7654d77b97-sxzhh_openstack(8ca5d372-6013-40e6-aee7-e541d60e99d1)\"" pod="openstack/heat-api-7654d77b97-sxzhh" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.803732 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7654d77b97-sxzhh" event={"ID":"8ca5d372-6013-40e6-aee7-e541d60e99d1","Type":"ContainerDied","Data":"7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb"}
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.817033 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7db7b4869c-gx756" podStartSLOduration=2.817011149 podStartE2EDuration="2.817011149s" podCreationTimestamp="2025-10-03 11:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:30:01.807138234 +0000 UTC m=+6383.603770111" watchObservedRunningTime="2025-10-03 11:30:01.817011149 +0000 UTC m=+6383.613643006"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.835429 4990 scope.go:117] "RemoveContainer" containerID="ed4323f5f871c0c46c431c4814e37c6d1b135f09c5b3f0ed9c82510cd1ae67d6"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.841791 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz" podStartSLOduration=1.8417722269999999 podStartE2EDuration="1.841772227s" podCreationTimestamp="2025-10-03 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:30:01.83064413 +0000 UTC m=+6383.627276007" watchObservedRunningTime="2025-10-03 11:30:01.841772227 +0000 UTC m=+6383.638404084"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.869699 4990 scope.go:117] "RemoveContainer" containerID="0a2718100d2f28d540c310e4edfc61c6c47fec185b642b2ab229e620c0a4e291"
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.896567 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"]
Oct 03 11:30:01 crc kubenswrapper[4990]: I1003 11:30:01.906420 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-67cc696c64-44ps6"]
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.811900 4990 generic.go:334] "Generic (PLEG): container finished" podID="4070aa48-04e2-4498-a16d-7f20675e55c7" containerID="db2420606c169afcbcab781b7a7639f76f63b010145ab959711f2cd1a6c2b4e6" exitCode=0
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.812068 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz" event={"ID":"4070aa48-04e2-4498-a16d-7f20675e55c7","Type":"ContainerDied","Data":"db2420606c169afcbcab781b7a7639f76f63b010145ab959711f2cd1a6c2b4e6"}
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.815312 4990 scope.go:117] "RemoveContainer" containerID="7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb"
Oct 03 11:30:02 crc kubenswrapper[4990]: E1003 11:30:02.815729 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7654d77b97-sxzhh_openstack(8ca5d372-6013-40e6-aee7-e541d60e99d1)\"" pod="openstack/heat-api-7654d77b97-sxzhh" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1"
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.817990 4990 scope.go:117] "RemoveContainer" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109"
Oct 03 11:30:02 crc kubenswrapper[4990]: E1003 11:30:02.818316 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6888547c6d-9pfmf_openstack(daca24d0-70d5-4adb-b77a-6b8499d7363e)\"" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e"
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.885625 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" path="/var/lib/kubelet/pods/951440f6-bb9c-4d4b-9f17-dcc280463405/volumes"
Oct 03 11:30:02 crc kubenswrapper[4990]: I1003 11:30:02.886135 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78899a2-aa4b-490f-a0b5-8b1de2c07012" path="/var/lib/kubelet/pods/c78899a2-aa4b-490f-a0b5-8b1de2c07012/volumes"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.376949 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7654d77b97-sxzhh"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.376997 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7654d77b97-sxzhh"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.387972 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6888547c6d-9pfmf"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.388097 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6888547c6d-9pfmf"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.827138 4990 scope.go:117] "RemoveContainer" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109"
Oct 03 11:30:03 crc kubenswrapper[4990]: I1003 11:30:03.827247 4990 scope.go:117] "RemoveContainer" containerID="7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb"
Oct 03 11:30:03 crc kubenswrapper[4990]: E1003 11:30:03.827490 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6888547c6d-9pfmf_openstack(daca24d0-70d5-4adb-b77a-6b8499d7363e)\"" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e"
Oct 03 11:30:03 crc kubenswrapper[4990]: E1003 11:30:03.827700 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7654d77b97-sxzhh_openstack(8ca5d372-6013-40e6-aee7-e541d60e99d1)\"" pod="openstack/heat-api-7654d77b97-sxzhh" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.234586 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.389106 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwmcs\" (UniqueName: \"kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs\") pod \"4070aa48-04e2-4498-a16d-7f20675e55c7\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") "
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.389172 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume\") pod \"4070aa48-04e2-4498-a16d-7f20675e55c7\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") "
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.389397 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume\") pod \"4070aa48-04e2-4498-a16d-7f20675e55c7\" (UID: \"4070aa48-04e2-4498-a16d-7f20675e55c7\") "
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.390121 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume" (OuterVolumeSpecName: "config-volume") pod "4070aa48-04e2-4498-a16d-7f20675e55c7" (UID: "4070aa48-04e2-4498-a16d-7f20675e55c7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.394313 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4070aa48-04e2-4498-a16d-7f20675e55c7" (UID: "4070aa48-04e2-4498-a16d-7f20675e55c7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.394417 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs" (OuterVolumeSpecName: "kube-api-access-cwmcs") pod "4070aa48-04e2-4498-a16d-7f20675e55c7" (UID: "4070aa48-04e2-4498-a16d-7f20675e55c7"). InnerVolumeSpecName "kube-api-access-cwmcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.492022 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4070aa48-04e2-4498-a16d-7f20675e55c7-config-volume\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.492065 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwmcs\" (UniqueName: \"kubernetes.io/projected/4070aa48-04e2-4498-a16d-7f20675e55c7-kube-api-access-cwmcs\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.492078 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4070aa48-04e2-4498-a16d-7f20675e55c7-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.648682 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-c6c5fff94-4vqjd" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.130:8004/healthcheck\": read tcp 10.217.0.2:52460->10.217.1.130:8004: read: connection reset by peer"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.844502 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz" event={"ID":"4070aa48-04e2-4498-a16d-7f20675e55c7","Type":"ContainerDied","Data":"104bed9ca84294d329c821cf2099745c933945b021491f99814470178c8a53fe"}
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.844829 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="104bed9ca84294d329c821cf2099745c933945b021491f99814470178c8a53fe"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.844563 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.846128 4990 generic.go:334] "Generic (PLEG): container finished" podID="a778e77c-5e84-4479-b318-2aa18673a832" containerID="c40d8fedb74587e1d6a32db26b287e1d1eb03fc3f575966e580af4a99eb4fd40" exitCode=0
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.846946 4990 scope.go:117] "RemoveContainer" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109"
Oct 03 11:30:04 crc kubenswrapper[4990]: E1003 11:30:04.847329 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6888547c6d-9pfmf_openstack(daca24d0-70d5-4adb-b77a-6b8499d7363e)\"" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e"
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.847648 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-c6c5fff94-4vqjd" event={"ID":"a778e77c-5e84-4479-b318-2aa18673a832","Type":"ContainerDied","Data":"c40d8fedb74587e1d6a32db26b287e1d1eb03fc3f575966e580af4a99eb4fd40"}
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.902942 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482"]
Oct 03 11:30:04 crc kubenswrapper[4990]: I1003
11:30:04.907390 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324805-v4482"] Oct 03 11:30:04 crc kubenswrapper[4990]: I1003 11:30:04.979352 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.102376 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle\") pod \"a778e77c-5e84-4479-b318-2aa18673a832\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.102533 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data\") pod \"a778e77c-5e84-4479-b318-2aa18673a832\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.102607 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkgs\" (UniqueName: \"kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs\") pod \"a778e77c-5e84-4479-b318-2aa18673a832\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.102642 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom\") pod \"a778e77c-5e84-4479-b318-2aa18673a832\" (UID: \"a778e77c-5e84-4479-b318-2aa18673a832\") " Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.106753 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs" 
(OuterVolumeSpecName: "kube-api-access-vzkgs") pod "a778e77c-5e84-4479-b318-2aa18673a832" (UID: "a778e77c-5e84-4479-b318-2aa18673a832"). InnerVolumeSpecName "kube-api-access-vzkgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.107282 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a778e77c-5e84-4479-b318-2aa18673a832" (UID: "a778e77c-5e84-4479-b318-2aa18673a832"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.128692 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a778e77c-5e84-4479-b318-2aa18673a832" (UID: "a778e77c-5e84-4479-b318-2aa18673a832"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.171464 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data" (OuterVolumeSpecName: "config-data") pod "a778e77c-5e84-4479-b318-2aa18673a832" (UID: "a778e77c-5e84-4479-b318-2aa18673a832"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.205485 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.205539 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkgs\" (UniqueName: \"kubernetes.io/projected/a778e77c-5e84-4479-b318-2aa18673a832-kube-api-access-vzkgs\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.205555 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.205568 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a778e77c-5e84-4479-b318-2aa18673a832-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.862241 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-c6c5fff94-4vqjd" event={"ID":"a778e77c-5e84-4479-b318-2aa18673a832","Type":"ContainerDied","Data":"cc371254e3145aff2ec98f780de45f72be0e51b91fe1886dac0510eb0ffe77f9"} Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.862619 4990 scope.go:117] "RemoveContainer" containerID="c40d8fedb74587e1d6a32db26b287e1d1eb03fc3f575966e580af4a99eb4fd40" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.862402 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-c6c5fff94-4vqjd" Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.905281 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:30:05 crc kubenswrapper[4990]: I1003 11:30:05.915559 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-c6c5fff94-4vqjd"] Oct 03 11:30:06 crc kubenswrapper[4990]: I1003 11:30:06.887069 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56952dbd-209c-4fad-adfe-ea5dc0a0c349" path="/var/lib/kubelet/pods/56952dbd-209c-4fad-adfe-ea5dc0a0c349/volumes" Oct 03 11:30:06 crc kubenswrapper[4990]: I1003 11:30:06.889265 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a778e77c-5e84-4479-b318-2aa18673a832" path="/var/lib/kubelet/pods/a778e77c-5e84-4479-b318-2aa18673a832/volumes" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.041040 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-9b7c9fb-mfzpc" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.085452 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7db7b4869c-gx756" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.118984 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.201223 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.230455 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.433204 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-67cc696c64-44ps6" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" 
probeResult="failure" output="Get \"http://10.217.1.131:8000/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.636733 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.641284 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.647014 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c5fb6fc74-sw79s" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.122:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.122:8443: connect: connection refused" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.647136 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763644 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqfx9\" (UniqueName: \"kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9\") pod \"8ca5d372-6013-40e6-aee7-e541d60e99d1\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763714 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle\") pod \"daca24d0-70d5-4adb-b77a-6b8499d7363e\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763788 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data\") pod \"8ca5d372-6013-40e6-aee7-e541d60e99d1\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763850 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom\") pod \"8ca5d372-6013-40e6-aee7-e541d60e99d1\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763907 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgs29\" (UniqueName: \"kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29\") pod \"daca24d0-70d5-4adb-b77a-6b8499d7363e\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.763969 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data\") pod \"daca24d0-70d5-4adb-b77a-6b8499d7363e\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.764039 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom\") pod \"daca24d0-70d5-4adb-b77a-6b8499d7363e\" (UID: \"daca24d0-70d5-4adb-b77a-6b8499d7363e\") " Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.764066 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle\") pod \"8ca5d372-6013-40e6-aee7-e541d60e99d1\" (UID: \"8ca5d372-6013-40e6-aee7-e541d60e99d1\") " Oct 03 11:30:11 crc 
kubenswrapper[4990]: I1003 11:30:11.769612 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29" (OuterVolumeSpecName: "kube-api-access-bgs29") pod "daca24d0-70d5-4adb-b77a-6b8499d7363e" (UID: "daca24d0-70d5-4adb-b77a-6b8499d7363e"). InnerVolumeSpecName "kube-api-access-bgs29". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.769672 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ca5d372-6013-40e6-aee7-e541d60e99d1" (UID: "8ca5d372-6013-40e6-aee7-e541d60e99d1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.770288 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9" (OuterVolumeSpecName: "kube-api-access-fqfx9") pod "8ca5d372-6013-40e6-aee7-e541d60e99d1" (UID: "8ca5d372-6013-40e6-aee7-e541d60e99d1"). InnerVolumeSpecName "kube-api-access-fqfx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.778767 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "daca24d0-70d5-4adb-b77a-6b8499d7363e" (UID: "daca24d0-70d5-4adb-b77a-6b8499d7363e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.794219 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daca24d0-70d5-4adb-b77a-6b8499d7363e" (UID: "daca24d0-70d5-4adb-b77a-6b8499d7363e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.794857 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ca5d372-6013-40e6-aee7-e541d60e99d1" (UID: "8ca5d372-6013-40e6-aee7-e541d60e99d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.822812 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data" (OuterVolumeSpecName: "config-data") pod "daca24d0-70d5-4adb-b77a-6b8499d7363e" (UID: "daca24d0-70d5-4adb-b77a-6b8499d7363e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.828354 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data" (OuterVolumeSpecName: "config-data") pod "8ca5d372-6013-40e6-aee7-e541d60e99d1" (UID: "8ca5d372-6013-40e6-aee7-e541d60e99d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866240 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866271 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgs29\" (UniqueName: \"kubernetes.io/projected/daca24d0-70d5-4adb-b77a-6b8499d7363e-kube-api-access-bgs29\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866283 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866292 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866299 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866308 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqfx9\" (UniqueName: \"kubernetes.io/projected/8ca5d372-6013-40e6-aee7-e541d60e99d1-kube-api-access-fqfx9\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.866317 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daca24d0-70d5-4adb-b77a-6b8499d7363e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc 
kubenswrapper[4990]: I1003 11:30:11.866325 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5d372-6013-40e6-aee7-e541d60e99d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.933652 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" event={"ID":"daca24d0-70d5-4adb-b77a-6b8499d7363e","Type":"ContainerDied","Data":"65edc8ccd189b93826bd23eaa6b7373376427152e2be854b81c927d2dfeea9ab"} Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.933706 4990 scope.go:117] "RemoveContainer" containerID="c3f64c9a03fbcd81201cb1ae362fe7643cb881897685d3fc6936526352fb9109" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.933666 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6888547c6d-9pfmf" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.935087 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7654d77b97-sxzhh" event={"ID":"8ca5d372-6013-40e6-aee7-e541d60e99d1","Type":"ContainerDied","Data":"c438e65c786e01b05c11585fa969620327f4dd21b692710d538af4e79a6e9e8e"} Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.935136 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7654d77b97-sxzhh" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.975698 4990 scope.go:117] "RemoveContainer" containerID="7089fabbe814b57948a30f86da8fcbb1d7521c0fed1996aa6817540f2cd05cfb" Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.977593 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.986951 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7654d77b97-sxzhh"] Oct 03 11:30:11 crc kubenswrapper[4990]: I1003 11:30:11.994973 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.002754 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6888547c6d-9pfmf"] Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.027361 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8080-account-create-5kpjw"] Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.037243 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8080-account-create-5kpjw"] Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.882250 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6289514c-49de-491b-8f6a-f2f300f4ec76" path="/var/lib/kubelet/pods/6289514c-49de-491b-8f6a-f2f300f4ec76/volumes" Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.883118 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" path="/var/lib/kubelet/pods/8ca5d372-6013-40e6-aee7-e541d60e99d1/volumes" Oct 03 11:30:12 crc kubenswrapper[4990]: I1003 11:30:12.883659 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" path="/var/lib/kubelet/pods/daca24d0-70d5-4adb-b77a-6b8499d7363e/volumes" Oct 03 11:30:14 
crc kubenswrapper[4990]: I1003 11:30:14.410862 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524167 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524273 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524352 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjsgq\" (UniqueName: \"kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524685 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524761 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524804 
4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.524834 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts\") pod \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\" (UID: \"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d\") " Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.525353 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs" (OuterVolumeSpecName: "logs") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.525893 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.529663 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq" (OuterVolumeSpecName: "kube-api-access-rjsgq") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "kube-api-access-rjsgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.530410 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.553360 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data" (OuterVolumeSpecName: "config-data") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.557422 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts" (OuterVolumeSpecName: "scripts") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.559658 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.577498 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" (UID: "f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628534 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628570 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628582 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjsgq\" (UniqueName: \"kubernetes.io/projected/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-kube-api-access-rjsgq\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628591 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628600 4990 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.628608 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.980076 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c5fb6fc74-sw79s" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.980397 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerDied","Data":"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98"} Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.981511 4990 scope.go:117] "RemoveContainer" containerID="e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed" Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.980423 4990 generic.go:334] "Generic (PLEG): container finished" podID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerID="9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98" exitCode=137 Oct 03 11:30:14 crc kubenswrapper[4990]: I1003 11:30:14.981616 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c5fb6fc74-sw79s" event={"ID":"f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d","Type":"ContainerDied","Data":"a43b04250082e619513d75d2e0bc2037466e7f98179cf728b1537124f580cb06"} Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.017687 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.028969 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c5fb6fc74-sw79s"] Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.196099 4990 scope.go:117] "RemoveContainer" containerID="9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98" Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.234459 4990 scope.go:117] "RemoveContainer" 
containerID="e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed" Oct 03 11:30:15 crc kubenswrapper[4990]: E1003 11:30:15.235205 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed\": container with ID starting with e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed not found: ID does not exist" containerID="e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed" Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.235268 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed"} err="failed to get container status \"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed\": rpc error: code = NotFound desc = could not find container \"e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed\": container with ID starting with e8b49e48c39890d9fdeb7f4c91c803a6fc1a23055fce6649db56af640684beed not found: ID does not exist" Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.235306 4990 scope.go:117] "RemoveContainer" containerID="9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98" Oct 03 11:30:15 crc kubenswrapper[4990]: E1003 11:30:15.235902 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98\": container with ID starting with 9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98 not found: ID does not exist" containerID="9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98" Oct 03 11:30:15 crc kubenswrapper[4990]: I1003 11:30:15.235971 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98"} err="failed to get container status \"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98\": rpc error: code = NotFound desc = could not find container \"9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98\": container with ID starting with 9781971228dfd253a7f08148af3c297749e93a63356872928559046194cd8e98 not found: ID does not exist" Oct 03 11:30:16 crc kubenswrapper[4990]: I1003 11:30:16.894054 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" path="/var/lib/kubelet/pods/f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d/volumes" Oct 03 11:30:18 crc kubenswrapper[4990]: I1003 11:30:18.336033 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6c667dbb67-dlps2" Oct 03 11:30:18 crc kubenswrapper[4990]: I1003 11:30:18.392659 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:30:18 crc kubenswrapper[4990]: I1003 11:30:18.392903 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6bc484dfb4-kr56g" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerName="heat-engine" containerID="cri-o://6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" gracePeriod=60 Oct 03 11:30:20 crc kubenswrapper[4990]: I1003 11:30:20.033862 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w97ck"] Oct 03 11:30:20 crc kubenswrapper[4990]: I1003 11:30:20.048890 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w97ck"] Oct 03 11:30:20 crc kubenswrapper[4990]: I1003 11:30:20.888774 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6defce53-beca-45b7-9450-398c8ee12108" path="/var/lib/kubelet/pods/6defce53-beca-45b7-9450-398c8ee12108/volumes" Oct 03 11:30:21 crc 
kubenswrapper[4990]: E1003 11:30:21.069635 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 11:30:21 crc kubenswrapper[4990]: E1003 11:30:21.071154 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 11:30:21 crc kubenswrapper[4990]: E1003 11:30:21.073365 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 11:30:21 crc kubenswrapper[4990]: E1003 11:30:21.073634 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6bc484dfb4-kr56g" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerName="heat-engine" Oct 03 11:30:25 crc kubenswrapper[4990]: I1003 11:30:25.304480 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:30:25 crc kubenswrapper[4990]: I1003 11:30:25.305243 4990 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:30:25 crc kubenswrapper[4990]: I1003 11:30:25.305312 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:30:25 crc kubenswrapper[4990]: I1003 11:30:25.306543 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:30:25 crc kubenswrapper[4990]: I1003 11:30:25.306809 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" gracePeriod=600 Oct 03 11:30:25 crc kubenswrapper[4990]: E1003 11:30:25.441062 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:30:26 crc kubenswrapper[4990]: I1003 11:30:26.120857 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" 
containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" exitCode=0 Oct 03 11:30:26 crc kubenswrapper[4990]: I1003 11:30:26.120932 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1"} Oct 03 11:30:26 crc kubenswrapper[4990]: I1003 11:30:26.121000 4990 scope.go:117] "RemoveContainer" containerID="ede822eb416de1cd2298678efb3de56140b965041f9c6d5dad16b6ae484057a3" Oct 03 11:30:26 crc kubenswrapper[4990]: I1003 11:30:26.121801 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:30:26 crc kubenswrapper[4990]: E1003 11:30:26.122399 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.560420 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.708098 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom\") pod \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.708584 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k6cg\" (UniqueName: \"kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg\") pod \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.708775 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle\") pod \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.708810 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data\") pod \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\" (UID: \"0f8e9d9f-6e68-46bc-bf9f-fc536507111b\") " Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.714302 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg" (OuterVolumeSpecName: "kube-api-access-7k6cg") pod "0f8e9d9f-6e68-46bc-bf9f-fc536507111b" (UID: "0f8e9d9f-6e68-46bc-bf9f-fc536507111b"). InnerVolumeSpecName "kube-api-access-7k6cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.723025 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f8e9d9f-6e68-46bc-bf9f-fc536507111b" (UID: "0f8e9d9f-6e68-46bc-bf9f-fc536507111b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.754007 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f8e9d9f-6e68-46bc-bf9f-fc536507111b" (UID: "0f8e9d9f-6e68-46bc-bf9f-fc536507111b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.766477 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data" (OuterVolumeSpecName: "config-data") pod "0f8e9d9f-6e68-46bc-bf9f-fc536507111b" (UID: "0f8e9d9f-6e68-46bc-bf9f-fc536507111b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.811173 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k6cg\" (UniqueName: \"kubernetes.io/projected/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-kube-api-access-7k6cg\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.811206 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.811223 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:30 crc kubenswrapper[4990]: I1003 11:30:30.811234 4990 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f8e9d9f-6e68-46bc-bf9f-fc536507111b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.189194 4990 generic.go:334] "Generic (PLEG): container finished" podID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" exitCode=0 Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.189249 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bc484dfb4-kr56g" event={"ID":"0f8e9d9f-6e68-46bc-bf9f-fc536507111b","Type":"ContainerDied","Data":"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea"} Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.189448 4990 scope.go:117] "RemoveContainer" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.189582 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-engine-6bc484dfb4-kr56g" event={"ID":"0f8e9d9f-6e68-46bc-bf9f-fc536507111b","Type":"ContainerDied","Data":"583bd78a6f07e51c38e439d3bcd1027ab1539b2e96a5cf1708d2991d7d556b34"} Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.189953 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6bc484dfb4-kr56g" Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.224777 4990 scope.go:117] "RemoveContainer" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" Oct 03 11:30:31 crc kubenswrapper[4990]: E1003 11:30:31.225358 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea\": container with ID starting with 6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea not found: ID does not exist" containerID="6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea" Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.225403 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea"} err="failed to get container status \"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea\": rpc error: code = NotFound desc = could not find container \"6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea\": container with ID starting with 6ee9fcdfcc9d818978679daa0611411d7bd017d6b9e08eef847a3fb64fc3eaea not found: ID does not exist" Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.226112 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:30:31 crc kubenswrapper[4990]: I1003 11:30:31.236280 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6bc484dfb4-kr56g"] Oct 03 11:30:32 crc kubenswrapper[4990]: 
I1003 11:30:32.884257 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" path="/var/lib/kubelet/pods/0f8e9d9f-6e68-46bc-bf9f-fc536507111b/volumes" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.219068 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9"] Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220087 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220103 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220120 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220129 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220138 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220146 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220160 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220168 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc 
kubenswrapper[4990]: E1003 11:30:37.220182 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon-log" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220189 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon-log" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220213 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220220 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220242 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerName="heat-engine" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220250 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerName="heat-engine" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220272 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4070aa48-04e2-4498-a16d-7f20675e55c7" containerName="collect-profiles" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220279 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4070aa48-04e2-4498-a16d-7f20675e55c7" containerName="collect-profiles" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220290 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220297 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220532 4990 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a778e77c-5e84-4479-b318-2aa18673a832" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220555 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4070aa48-04e2-4498-a16d-7f20675e55c7" containerName="collect-profiles" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220569 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220583 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220593 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220603 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="daca24d0-70d5-4adb-b77a-6b8499d7363e" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220618 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220634 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08ca166-3f80-40f9-9c8d-d4bf3ca2df9d" containerName="horizon-log" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220644 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f8e9d9f-6e68-46bc-bf9f-fc536507111b" containerName="heat-engine" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220659 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="951440f6-bb9c-4d4b-9f17-dcc280463405" containerName="heat-cfnapi" Oct 03 11:30:37 crc kubenswrapper[4990]: E1003 11:30:37.220863 4990 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.220873 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5d372-6013-40e6-aee7-e541d60e99d1" containerName="heat-api" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.222121 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.225763 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.232300 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9"] Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.351133 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.351234 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzp8h\" (UniqueName: \"kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.351562 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.453501 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.453861 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzp8h\" (UniqueName: \"kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.454441 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.454176 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.454873 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.477240 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzp8h\" (UniqueName: \"kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:37 crc kubenswrapper[4990]: I1003 11:30:37.553398 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:38 crc kubenswrapper[4990]: I1003 11:30:38.067637 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9"] Oct 03 11:30:38 crc kubenswrapper[4990]: I1003 11:30:38.260391 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" event={"ID":"d0891d41-d167-40c2-a9a0-5a44a84f8a71","Type":"ContainerStarted","Data":"f1f12bffe98e902b3b66ab694f3ba39f7c933c50a5de98d27c6ad9454b165b4e"} Oct 03 11:30:38 crc kubenswrapper[4990]: I1003 11:30:38.881982 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:30:38 crc kubenswrapper[4990]: E1003 11:30:38.882986 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:30:39 crc kubenswrapper[4990]: I1003 11:30:39.272213 4990 generic.go:334] "Generic (PLEG): container finished" podID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerID="b73a370947cfcfdf81a21e08fd950cfef8d1f4a1c2b572de51cd702dbf9938e1" exitCode=0 Oct 03 11:30:39 crc kubenswrapper[4990]: I1003 11:30:39.272254 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" event={"ID":"d0891d41-d167-40c2-a9a0-5a44a84f8a71","Type":"ContainerDied","Data":"b73a370947cfcfdf81a21e08fd950cfef8d1f4a1c2b572de51cd702dbf9938e1"} Oct 03 11:30:39 crc 
kubenswrapper[4990]: I1003 11:30:39.274287 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:30:41 crc kubenswrapper[4990]: I1003 11:30:41.298265 4990 generic.go:334] "Generic (PLEG): container finished" podID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerID="1e2cba9c7e0c48aea0dc2374b60470181ef98195567ba74783d104e600143ec9" exitCode=0 Oct 03 11:30:41 crc kubenswrapper[4990]: I1003 11:30:41.298531 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" event={"ID":"d0891d41-d167-40c2-a9a0-5a44a84f8a71","Type":"ContainerDied","Data":"1e2cba9c7e0c48aea0dc2374b60470181ef98195567ba74783d104e600143ec9"} Oct 03 11:30:42 crc kubenswrapper[4990]: I1003 11:30:42.313084 4990 generic.go:334] "Generic (PLEG): container finished" podID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerID="643a02a751b3165413c99bcbeb6821a9a347b6bb2d9c82411e24158b0ccdf08e" exitCode=0 Oct 03 11:30:42 crc kubenswrapper[4990]: I1003 11:30:42.313152 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" event={"ID":"d0891d41-d167-40c2-a9a0-5a44a84f8a71","Type":"ContainerDied","Data":"643a02a751b3165413c99bcbeb6821a9a347b6bb2d9c82411e24158b0ccdf08e"} Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.672245 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.810640 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle\") pod \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.810803 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util\") pod \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.810862 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzp8h\" (UniqueName: \"kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h\") pod \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\" (UID: \"d0891d41-d167-40c2-a9a0-5a44a84f8a71\") " Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.813359 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle" (OuterVolumeSpecName: "bundle") pod "d0891d41-d167-40c2-a9a0-5a44a84f8a71" (UID: "d0891d41-d167-40c2-a9a0-5a44a84f8a71"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.818619 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h" (OuterVolumeSpecName: "kube-api-access-lzp8h") pod "d0891d41-d167-40c2-a9a0-5a44a84f8a71" (UID: "d0891d41-d167-40c2-a9a0-5a44a84f8a71"). InnerVolumeSpecName "kube-api-access-lzp8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.821139 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util" (OuterVolumeSpecName: "util") pod "d0891d41-d167-40c2-a9a0-5a44a84f8a71" (UID: "d0891d41-d167-40c2-a9a0-5a44a84f8a71"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.913353 4990 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-util\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.913396 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzp8h\" (UniqueName: \"kubernetes.io/projected/d0891d41-d167-40c2-a9a0-5a44a84f8a71-kube-api-access-lzp8h\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:43 crc kubenswrapper[4990]: I1003 11:30:43.913409 4990 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0891d41-d167-40c2-a9a0-5a44a84f8a71-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:30:44 crc kubenswrapper[4990]: I1003 11:30:44.341627 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" event={"ID":"d0891d41-d167-40c2-a9a0-5a44a84f8a71","Type":"ContainerDied","Data":"f1f12bffe98e902b3b66ab694f3ba39f7c933c50a5de98d27c6ad9454b165b4e"} Oct 03 11:30:44 crc kubenswrapper[4990]: I1003 11:30:44.341670 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1f12bffe98e902b3b66ab694f3ba39f7c933c50a5de98d27c6ad9454b165b4e" Oct 03 11:30:44 crc kubenswrapper[4990]: I1003 11:30:44.341742 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9" Oct 03 11:30:49 crc kubenswrapper[4990]: I1003 11:30:49.071369 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kfp5z"] Oct 03 11:30:49 crc kubenswrapper[4990]: I1003 11:30:49.101338 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kfp5z"] Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.850502 4990 scope.go:117] "RemoveContainer" containerID="b28df4a58f97f2ffcc59aa61b5b4f893608704a164f93fc76d33c812bef6d2ab" Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.876723 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:30:50 crc kubenswrapper[4990]: E1003 11:30:50.876996 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.887557 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f624aa60-521f-442b-a15d-f7eb21df31e7" path="/var/lib/kubelet/pods/f624aa60-521f-442b-a15d-f7eb21df31e7/volumes" Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.899861 4990 scope.go:117] "RemoveContainer" containerID="94576bebfbb3aafab588bc36e3e6ee0e45cf7dee92250bd249af2da3c3cd0617" Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.925415 4990 scope.go:117] "RemoveContainer" containerID="02a10f97eb136ec85f01e171348a05a0e8c51ab086a5c244e78e21921faeff03" Oct 03 11:30:50 crc kubenswrapper[4990]: I1003 11:30:50.994604 4990 scope.go:117] "RemoveContainer" 
containerID="2c231615a596a195069ee92680a0d4521e42d4e4a241234cf57b0ccdfe878fcc" Oct 03 11:30:51 crc kubenswrapper[4990]: I1003 11:30:51.036445 4990 scope.go:117] "RemoveContainer" containerID="43ad5333a9497a376d4980a650d1e111dd599e0ce8d1a9e8136b8c09a8fcc1b1" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.542099 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6"] Oct 03 11:30:53 crc kubenswrapper[4990]: E1003 11:30:53.543036 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="pull" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.543050 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="pull" Oct 03 11:30:53 crc kubenswrapper[4990]: E1003 11:30:53.543077 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="util" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.543085 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="util" Oct 03 11:30:53 crc kubenswrapper[4990]: E1003 11:30:53.543095 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="extract" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.543101 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="extract" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.543280 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0891d41-d167-40c2-a9a0-5a44a84f8a71" containerName="extract" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.543966 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.549404 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.549573 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.549687 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-qv8hr" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.553953 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.639824 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6n4v\" (UniqueName: \"kubernetes.io/projected/14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b-kube-api-access-d6n4v\") pod \"obo-prometheus-operator-7c8cf85677-kn4p6\" (UID: \"14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.662158 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.665599 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.669293 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.669481 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-hj6z4" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.673317 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.675289 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.688347 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.718673 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.741622 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.741698 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6n4v\" (UniqueName: \"kubernetes.io/projected/14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b-kube-api-access-d6n4v\") pod \"obo-prometheus-operator-7c8cf85677-kn4p6\" (UID: \"14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.742524 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.742589 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.742712 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.779880 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6n4v\" (UniqueName: \"kubernetes.io/projected/14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b-kube-api-access-d6n4v\") pod 
\"obo-prometheus-operator-7c8cf85677-kn4p6\" (UID: \"14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.844824 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.844898 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.844998 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.845098 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 
crc kubenswrapper[4990]: I1003 11:30:53.849173 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.849725 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.857032 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1db65103-4b9c-4ef5-8d8c-f12e4c697871-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn\" (UID: \"1db65103-4b9c-4ef5-8d8c-f12e4c697871\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.860095 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/03fd91e9-b108-4b44-8ded-67dc4bc47e98-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7\" (UID: \"03fd91e9-b108-4b44-8ded-67dc4bc47e98\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.864131 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.868565 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-62fwf"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.870380 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.874383 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-g7d9s" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.874606 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.911266 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-62fwf"] Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.947017 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd76c\" (UniqueName: \"kubernetes.io/projected/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-kube-api-access-fd76c\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 11:30:53.948423 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:53 crc kubenswrapper[4990]: I1003 
11:30:53.992636 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.020205 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.050008 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.050152 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd76c\" (UniqueName: \"kubernetes.io/projected/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-kube-api-access-fd76c\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.057307 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.103648 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-68df8"] Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.105278 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.109985 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-7fls5" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.116199 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd76c\" (UniqueName: \"kubernetes.io/projected/ecd65876-60c2-45ca-84e9-51a9acb8b6e8-kube-api-access-fd76c\") pod \"observability-operator-cc5f78dfc-62fwf\" (UID: \"ecd65876-60c2-45ca-84e9-51a9acb8b6e8\") " pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.137303 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-68df8"] Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.257190 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b61f14d1-ad23-48c6-a6a3-bb45373e5128-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.257579 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshmw\" (UniqueName: \"kubernetes.io/projected/b61f14d1-ad23-48c6-a6a3-bb45373e5128-kube-api-access-rshmw\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.310245 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.366597 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshmw\" (UniqueName: \"kubernetes.io/projected/b61f14d1-ad23-48c6-a6a3-bb45373e5128-kube-api-access-rshmw\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.366702 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b61f14d1-ad23-48c6-a6a3-bb45373e5128-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.367686 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/b61f14d1-ad23-48c6-a6a3-bb45373e5128-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.391275 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshmw\" (UniqueName: \"kubernetes.io/projected/b61f14d1-ad23-48c6-a6a3-bb45373e5128-kube-api-access-rshmw\") pod \"perses-operator-54bc95c9fb-68df8\" (UID: \"b61f14d1-ad23-48c6-a6a3-bb45373e5128\") " pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.492582 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6"] Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 
11:30:54.532018 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.837025 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn"] Oct 03 11:30:54 crc kubenswrapper[4990]: W1003 11:30:54.843748 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1db65103_4b9c_4ef5_8d8c_f12e4c697871.slice/crio-545310243d57aded56a80bbc635fa0adbc58bf65937606ade37d870209f0e203 WatchSource:0}: Error finding container 545310243d57aded56a80bbc635fa0adbc58bf65937606ade37d870209f0e203: Status 404 returned error can't find the container with id 545310243d57aded56a80bbc635fa0adbc58bf65937606ade37d870209f0e203 Oct 03 11:30:54 crc kubenswrapper[4990]: W1003 11:30:54.955273 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03fd91e9_b108_4b44_8ded_67dc4bc47e98.slice/crio-01adc79caf98e784e7feed20e17b77a6ce2247a0062615294098968806a6fd81 WatchSource:0}: Error finding container 01adc79caf98e784e7feed20e17b77a6ce2247a0062615294098968806a6fd81: Status 404 returned error can't find the container with id 01adc79caf98e784e7feed20e17b77a6ce2247a0062615294098968806a6fd81 Oct 03 11:30:54 crc kubenswrapper[4990]: I1003 11:30:54.979138 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7"] Oct 03 11:30:55 crc kubenswrapper[4990]: W1003 11:30:55.032564 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd65876_60c2_45ca_84e9_51a9acb8b6e8.slice/crio-0a3ca94c9a16d1048fc008e4e93d7edf7c2ae00f0d136155e5567f952ecc4d1c WatchSource:0}: Error finding container 
0a3ca94c9a16d1048fc008e4e93d7edf7c2ae00f0d136155e5567f952ecc4d1c: Status 404 returned error can't find the container with id 0a3ca94c9a16d1048fc008e4e93d7edf7c2ae00f0d136155e5567f952ecc4d1c Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.034801 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-62fwf"] Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.214964 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-68df8"] Oct 03 11:30:55 crc kubenswrapper[4990]: W1003 11:30:55.227627 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61f14d1_ad23_48c6_a6a3_bb45373e5128.slice/crio-3c5429c43876fded410861668dcf48d68711113d2ce2dfc76587c78447617284 WatchSource:0}: Error finding container 3c5429c43876fded410861668dcf48d68711113d2ce2dfc76587c78447617284: Status 404 returned error can't find the container with id 3c5429c43876fded410861668dcf48d68711113d2ce2dfc76587c78447617284 Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.477907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" event={"ID":"03fd91e9-b108-4b44-8ded-67dc4bc47e98","Type":"ContainerStarted","Data":"01adc79caf98e784e7feed20e17b77a6ce2247a0062615294098968806a6fd81"} Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.480629 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" event={"ID":"14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b","Type":"ContainerStarted","Data":"6b9e8772af61d43f58db236368611cef8ef4c868c919ba5e13decd5c7e958a9a"} Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.483174 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" 
event={"ID":"ecd65876-60c2-45ca-84e9-51a9acb8b6e8","Type":"ContainerStarted","Data":"0a3ca94c9a16d1048fc008e4e93d7edf7c2ae00f0d136155e5567f952ecc4d1c"} Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.490716 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" event={"ID":"b61f14d1-ad23-48c6-a6a3-bb45373e5128","Type":"ContainerStarted","Data":"3c5429c43876fded410861668dcf48d68711113d2ce2dfc76587c78447617284"} Oct 03 11:30:55 crc kubenswrapper[4990]: I1003 11:30:55.498628 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" event={"ID":"1db65103-4b9c-4ef5-8d8c-f12e4c697871","Type":"ContainerStarted","Data":"545310243d57aded56a80bbc635fa0adbc58bf65937606ade37d870209f0e203"} Oct 03 11:30:59 crc kubenswrapper[4990]: I1003 11:30:59.086573 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fe13-account-create-59jt8"] Oct 03 11:30:59 crc kubenswrapper[4990]: I1003 11:30:59.108390 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fe13-account-create-59jt8"] Oct 03 11:31:00 crc kubenswrapper[4990]: I1003 11:31:00.890068 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00497aea-4a18-4550-aea4-94c333754563" path="/var/lib/kubelet/pods/00497aea-4a18-4550-aea4-94c333754563/volumes" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.610569 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" event={"ID":"1db65103-4b9c-4ef5-8d8c-f12e4c697871","Type":"ContainerStarted","Data":"9287a96bf0b0b7a84049bd1403986cfdcbc7ed604b81da25c90c07a221209002"} Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.611869 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" 
event={"ID":"03fd91e9-b108-4b44-8ded-67dc4bc47e98","Type":"ContainerStarted","Data":"b7a22aefa25b96cccc465524ea6aa66f5c1458ff848014d944b2569f24cda0fc"} Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.614270 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" event={"ID":"14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b","Type":"ContainerStarted","Data":"890173b2c8b96d40fdf47ae7b7d2a63965e465c1c9fac6f4735720112bb45d2c"} Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.615900 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" event={"ID":"ecd65876-60c2-45ca-84e9-51a9acb8b6e8","Type":"ContainerStarted","Data":"3575facdcc9bc7eccb90a610768320a4307bdf49905e4c38bf9a37dbfa5b348a"} Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.616127 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.617671 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" event={"ID":"b61f14d1-ad23-48c6-a6a3-bb45373e5128","Type":"ContainerStarted","Data":"2c6891296b9237f37a9b5b88d63b71be2a9d15e1e5b7ad45d42123df0dd1bbe4"} Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.617751 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.637096 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn" podStartSLOduration=3.083364575 podStartE2EDuration="10.637077906s" podCreationTimestamp="2025-10-03 11:30:53 +0000 UTC" firstStartedPulling="2025-10-03 11:30:54.846257677 +0000 UTC m=+6436.642889534" lastFinishedPulling="2025-10-03 
11:31:02.399971008 +0000 UTC m=+6444.196602865" observedRunningTime="2025-10-03 11:31:03.635084514 +0000 UTC m=+6445.431716391" watchObservedRunningTime="2025-10-03 11:31:03.637077906 +0000 UTC m=+6445.433709763" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.663483 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7" podStartSLOduration=3.224470501 podStartE2EDuration="10.663464326s" podCreationTimestamp="2025-10-03 11:30:53 +0000 UTC" firstStartedPulling="2025-10-03 11:30:54.961101366 +0000 UTC m=+6436.757733233" lastFinishedPulling="2025-10-03 11:31:02.400095201 +0000 UTC m=+6444.196727058" observedRunningTime="2025-10-03 11:31:03.652815211 +0000 UTC m=+6445.449447188" watchObservedRunningTime="2025-10-03 11:31:03.663464326 +0000 UTC m=+6445.460096183" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.670683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.693099 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kn4p6" podStartSLOduration=2.800292943 podStartE2EDuration="10.693079828s" podCreationTimestamp="2025-10-03 11:30:53 +0000 UTC" firstStartedPulling="2025-10-03 11:30:54.509689748 +0000 UTC m=+6436.306321605" lastFinishedPulling="2025-10-03 11:31:02.402476633 +0000 UTC m=+6444.199108490" observedRunningTime="2025-10-03 11:31:03.680056723 +0000 UTC m=+6445.476688610" watchObservedRunningTime="2025-10-03 11:31:03.693079828 +0000 UTC m=+6445.489711685" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.718367 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" podStartSLOduration=2.540614864 podStartE2EDuration="9.718344789s" 
podCreationTimestamp="2025-10-03 11:30:54 +0000 UTC" firstStartedPulling="2025-10-03 11:30:55.229771527 +0000 UTC m=+6437.026403384" lastFinishedPulling="2025-10-03 11:31:02.407501452 +0000 UTC m=+6444.204133309" observedRunningTime="2025-10-03 11:31:03.704314438 +0000 UTC m=+6445.500946295" watchObservedRunningTime="2025-10-03 11:31:03.718344789 +0000 UTC m=+6445.514976646" Oct 03 11:31:03 crc kubenswrapper[4990]: I1003 11:31:03.758769 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-62fwf" podStartSLOduration=3.288172421 podStartE2EDuration="10.75874453s" podCreationTimestamp="2025-10-03 11:30:53 +0000 UTC" firstStartedPulling="2025-10-03 11:30:55.036165619 +0000 UTC m=+6436.832797476" lastFinishedPulling="2025-10-03 11:31:02.506737728 +0000 UTC m=+6444.303369585" observedRunningTime="2025-10-03 11:31:03.743800395 +0000 UTC m=+6445.540432272" watchObservedRunningTime="2025-10-03 11:31:03.75874453 +0000 UTC m=+6445.555376387" Oct 03 11:31:05 crc kubenswrapper[4990]: I1003 11:31:05.872461 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:31:05 crc kubenswrapper[4990]: E1003 11:31:05.873079 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:31:06 crc kubenswrapper[4990]: I1003 11:31:06.025624 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mdqzf"] Oct 03 11:31:06 crc kubenswrapper[4990]: I1003 11:31:06.035886 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-mdqzf"] Oct 03 11:31:06 crc kubenswrapper[4990]: I1003 11:31:06.915033 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ec54c8-f8df-4d83-921b-92df7a464ab0" path="/var/lib/kubelet/pods/e7ec54c8-f8df-4d83-921b-92df7a464ab0/volumes" Oct 03 11:31:14 crc kubenswrapper[4990]: I1003 11:31:14.534600 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-68df8" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.137950 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.138810 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" containerName="openstackclient" containerID="cri-o://eee9ddb4b22ed62d57a21ad6d7caf3839300771b294daf1bc986afbb0bee418a" gracePeriod=2 Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.162041 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.194685 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.195108 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" containerName="openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.195125 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" containerName="openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.195337 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" containerName="openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.197804 4990 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.210210 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" podUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.219150 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.249705 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.250885 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-ghp7x openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.270810 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.287578 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.289476 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.300490 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.308085 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghp7x\" (UniqueName: \"kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.308317 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.308497 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.308750 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.309906 4990 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410321 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410573 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp7x\" (UniqueName: \"kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410675 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410702 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xc8\" (UniqueName: \"kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc 
kubenswrapper[4990]: I1003 11:31:17.410736 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410812 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.410961 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.412406 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.421816 4990 projected.go:194] Error preparing data for projected volume kube-api-access-ghp7x for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (bfafcdcd-a3fe-4456-a9ef-bee2dff7c923) does not match the UID in record. 
The object might have been deleted and then recreated Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.421897 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x podName:bfafcdcd-a3fe-4456-a9ef-bee2dff7c923 nodeName:}" failed. No retries permitted until 2025-10-03 11:31:17.921876057 +0000 UTC m=+6459.718507914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ghp7x" (UniqueName: "kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x") pod "openstackclient" (UID: "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (bfafcdcd-a3fe-4456-a9ef-bee2dff7c923) does not match the UID in record. The object might have been deleted and then recreated Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.431826 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.432220 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.514702 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.514758 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xc8\" (UniqueName: \"kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.514806 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.514837 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.523079 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.524208 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.524558 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.539119 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.544650 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.547963 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4qqdc" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.550728 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xc8\" (UniqueName: \"kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8\") pod \"openstackclient\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.577271 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.633625 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.723669 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kpk\" (UniqueName: \"kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk\") pod \"kube-state-metrics-0\" (UID: \"42d04580-6d3d-47a8-b2f1-5e3b5314f21f\") " pod="openstack/kube-state-metrics-0" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.799378 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.826638 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.826706 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kpk\" (UniqueName: \"kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk\") pod \"kube-state-metrics-0\" (UID: \"42d04580-6d3d-47a8-b2f1-5e3b5314f21f\") " pod="openstack/kube-state-metrics-0" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.855467 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kpk\" (UniqueName: \"kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk\") pod \"kube-state-metrics-0\" (UID: \"42d04580-6d3d-47a8-b2f1-5e3b5314f21f\") " pod="openstack/kube-state-metrics-0" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.863339 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.866756 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:17 crc kubenswrapper[4990]: I1003 11:31:17.930083 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp7x\" (UniqueName: \"kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x\") pod \"openstackclient\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " pod="openstack/openstackclient" Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.937804 4990 projected.go:194] Error preparing data for projected volume kube-api-access-ghp7x for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (bfafcdcd-a3fe-4456-a9ef-bee2dff7c923) does not match the UID in record. The object might have been deleted and then recreated Oct 03 11:31:17 crc kubenswrapper[4990]: E1003 11:31:17.937872 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x podName:bfafcdcd-a3fe-4456-a9ef-bee2dff7c923 nodeName:}" failed. No retries permitted until 2025-10-03 11:31:18.937853338 +0000 UTC m=+6460.734485195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ghp7x" (UniqueName: "kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x") pod "openstackclient" (UID: "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (bfafcdcd-a3fe-4456-a9ef-bee2dff7c923) does not match the UID in record. 
The object might have been deleted and then recreated Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.031156 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config\") pod \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.031260 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret\") pod \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.031446 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle\") pod \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\" (UID: \"bfafcdcd-a3fe-4456-a9ef-bee2dff7c923\") " Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.031783 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" (UID: "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.032017 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.032036 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghp7x\" (UniqueName: \"kubernetes.io/projected/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-kube-api-access-ghp7x\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.035260 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" (UID: "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.035785 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" (UID: "bfafcdcd-a3fe-4456-a9ef-bee2dff7c923"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.047903 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.136004 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.136050 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.158674 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.161891 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.169286 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.169787 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-pxmpt" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.169950 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.170088 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.221601 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.341058 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.341164 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.341453 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.341579 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.341617 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: 
I1003 11:31:18.341858 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxct\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-kube-api-access-ndxct\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.409121 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.444834 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.444931 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.445016 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.445051 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: 
\"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.445073 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.445108 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxct\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-kube-api-access-ndxct\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.447082 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.452991 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9d0ce5a3-e78d-4cfa-a405-596641465359-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.454435 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: 
\"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.458082 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.458719 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9d0ce5a3-e78d-4cfa-a405-596641465359-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.478488 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxct\" (UniqueName: \"kubernetes.io/projected/9d0ce5a3-e78d-4cfa-a405-596641465359-kube-api-access-ndxct\") pod \"alertmanager-metric-storage-0\" (UID: \"9d0ce5a3-e78d-4cfa-a405-596641465359\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.517813 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.718557 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.775955 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.778345 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.794380 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-r4kz8" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.794852 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.794968 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.795090 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.795187 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.800830 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.822913 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.857966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858064 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858107 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858156 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858191 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858231 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ck5\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858258 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.858344 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.879268 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:31:18 crc kubenswrapper[4990]: E1003 11:31:18.879551 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.953284 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" path="/var/lib/kubelet/pods/bfafcdcd-a3fe-4456-a9ef-bee2dff7c923/volumes" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.953830 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42d04580-6d3d-47a8-b2f1-5e3b5314f21f","Type":"ContainerStarted","Data":"e6eca64076d4eb41ca36d85bd1f764e76fe348eee2d4f357d63a66a68e068639"} Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.954589 4990 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.955082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0a6c1508-60e9-4a8a-a94b-42a69020f6aa","Type":"ContainerStarted","Data":"994d9db06f19182ffb08fdea75f220bef8b69af9317bb9ee7d237bdc308e21d5"} Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.989758 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.989809 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ck5\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.989855 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.989934 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" 
(UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.990065 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.990123 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.990179 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:18 crc kubenswrapper[4990]: I1003 11:31:18.990337 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.014856 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " 
pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.037179 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.040205 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.041450 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.042054 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.069033 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="bfafcdcd-a3fe-4456-a9ef-bee2dff7c923" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.102455 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping MountDevice... Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.102533 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d775c517a9483db05ad2f3a1e80ecb74560382def537dc91feb7eb4e9c1dda4/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.138935 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.160360 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ck5\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.277774 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.479430 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:19 crc kubenswrapper[4990]: W1003 11:31:19.776216 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0ce5a3_e78d_4cfa_a405_596641465359.slice/crio-ffa9d77bd7e37033d487f87b22f56c10015935cd2f84fa6df3ad3748d9bc7039 WatchSource:0}: Error finding container ffa9d77bd7e37033d487f87b22f56c10015935cd2f84fa6df3ad3748d9bc7039: Status 404 returned error can't find the container with id ffa9d77bd7e37033d487f87b22f56c10015935cd2f84fa6df3ad3748d9bc7039 Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.788674 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.967707 4990 generic.go:334] "Generic (PLEG): container finished" podID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" containerID="eee9ddb4b22ed62d57a21ad6d7caf3839300771b294daf1bc986afbb0bee418a" exitCode=137 Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.971786 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9d0ce5a3-e78d-4cfa-a405-596641465359","Type":"ContainerStarted","Data":"ffa9d77bd7e37033d487f87b22f56c10015935cd2f84fa6df3ad3748d9bc7039"} Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.978246 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42d04580-6d3d-47a8-b2f1-5e3b5314f21f","Type":"ContainerStarted","Data":"160434ae19f253a2221271e7ef00e021da331c43fd335bad77dccfc278d5c6d2"} Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.979455 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 11:31:19 crc kubenswrapper[4990]: I1003 11:31:19.989729 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"0a6c1508-60e9-4a8a-a94b-42a69020f6aa","Type":"ContainerStarted","Data":"8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2"} Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.008113 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.46512231 podStartE2EDuration="3.008090377s" podCreationTimestamp="2025-10-03 11:31:17 +0000 UTC" firstStartedPulling="2025-10-03 11:31:18.756886577 +0000 UTC m=+6460.553518434" lastFinishedPulling="2025-10-03 11:31:19.299854644 +0000 UTC m=+6461.096486501" observedRunningTime="2025-10-03 11:31:19.997799472 +0000 UTC m=+6461.794431339" watchObservedRunningTime="2025-10-03 11:31:20.008090377 +0000 UTC m=+6461.804722254" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.021863 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.021846751 podStartE2EDuration="3.021846751s" podCreationTimestamp="2025-10-03 11:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:31:20.014065111 +0000 UTC m=+6461.810696978" watchObservedRunningTime="2025-10-03 11:31:20.021846751 +0000 UTC m=+6461.818478598" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.182338 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.189784 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.233492 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:20 crc kubenswrapper[4990]: W1003 11:31:20.242251 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983a5c7a_2886_45a2_9ca7_d432c7d79634.slice/crio-c8459d1214f30b71a27d735b4182fd3e44b769e59b3497ee8109d8112da8385c WatchSource:0}: Error finding container c8459d1214f30b71a27d735b4182fd3e44b769e59b3497ee8109d8112da8385c: Status 404 returned error can't find the container with id c8459d1214f30b71a27d735b4182fd3e44b769e59b3497ee8109d8112da8385c Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.308094 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle\") pod \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.308579 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret\") pod \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.308677 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config\") pod \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.308752 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76zcm\" (UniqueName: \"kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm\") pod \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\" (UID: \"17e9d7f7-f069-4491-bd0e-f40b43ed51fe\") " Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.323386 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm" (OuterVolumeSpecName: "kube-api-access-76zcm") pod "17e9d7f7-f069-4491-bd0e-f40b43ed51fe" (UID: "17e9d7f7-f069-4491-bd0e-f40b43ed51fe"). InnerVolumeSpecName "kube-api-access-76zcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.380738 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "17e9d7f7-f069-4491-bd0e-f40b43ed51fe" (UID: "17e9d7f7-f069-4491-bd0e-f40b43ed51fe"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.404860 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17e9d7f7-f069-4491-bd0e-f40b43ed51fe" (UID: "17e9d7f7-f069-4491-bd0e-f40b43ed51fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.413922 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.413971 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76zcm\" (UniqueName: \"kubernetes.io/projected/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-kube-api-access-76zcm\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.413984 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.436885 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "17e9d7f7-f069-4491-bd0e-f40b43ed51fe" (UID: "17e9d7f7-f069-4491-bd0e-f40b43ed51fe"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.515767 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/17e9d7f7-f069-4491-bd0e-f40b43ed51fe-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 11:31:20 crc kubenswrapper[4990]: I1003 11:31:20.883620 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" path="/var/lib/kubelet/pods/17e9d7f7-f069-4491-bd0e-f40b43ed51fe/volumes" Oct 03 11:31:21 crc kubenswrapper[4990]: I1003 11:31:21.011966 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 11:31:21 crc kubenswrapper[4990]: I1003 11:31:21.012007 4990 scope.go:117] "RemoveContainer" containerID="eee9ddb4b22ed62d57a21ad6d7caf3839300771b294daf1bc986afbb0bee418a" Oct 03 11:31:21 crc kubenswrapper[4990]: I1003 11:31:21.015462 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:21 crc kubenswrapper[4990]: I1003 11:31:21.015529 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerStarted","Data":"c8459d1214f30b71a27d735b4182fd3e44b769e59b3497ee8109d8112da8385c"} Oct 03 11:31:21 crc kubenswrapper[4990]: I1003 11:31:21.019259 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="17e9d7f7-f069-4491-bd0e-f40b43ed51fe" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" Oct 03 11:31:27 crc kubenswrapper[4990]: I1003 11:31:27.124258 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerStarted","Data":"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"} Oct 03 11:31:27 crc kubenswrapper[4990]: I1003 11:31:27.126093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9d0ce5a3-e78d-4cfa-a405-596641465359","Type":"ContainerStarted","Data":"e0e74d97078f93085d4c3e92c1e656f287f72520bd7c925552c97c4404c00c1c"} Oct 03 11:31:28 crc kubenswrapper[4990]: I1003 11:31:28.052167 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 11:31:29 crc kubenswrapper[4990]: I1003 11:31:29.871444 4990 scope.go:117] 
"RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:31:29 crc kubenswrapper[4990]: E1003 11:31:29.872011 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:31:34 crc kubenswrapper[4990]: I1003 11:31:34.204409 4990 generic.go:334] "Generic (PLEG): container finished" podID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0" exitCode=0 Oct 03 11:31:34 crc kubenswrapper[4990]: I1003 11:31:34.204501 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerDied","Data":"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"} Oct 03 11:31:34 crc kubenswrapper[4990]: I1003 11:31:34.214436 4990 generic.go:334] "Generic (PLEG): container finished" podID="9d0ce5a3-e78d-4cfa-a405-596641465359" containerID="e0e74d97078f93085d4c3e92c1e656f287f72520bd7c925552c97c4404c00c1c" exitCode=0 Oct 03 11:31:34 crc kubenswrapper[4990]: I1003 11:31:34.214475 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9d0ce5a3-e78d-4cfa-a405-596641465359","Type":"ContainerDied","Data":"e0e74d97078f93085d4c3e92c1e656f287f72520bd7c925552c97c4404c00c1c"} Oct 03 11:31:41 crc kubenswrapper[4990]: I1003 11:31:41.291254 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerStarted","Data":"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"} Oct 03 11:31:41 crc kubenswrapper[4990]: I1003 11:31:41.293720 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9d0ce5a3-e78d-4cfa-a405-596641465359","Type":"ContainerStarted","Data":"15a7ba7c2681eeb61408679653171aac123a06b3bc61a87e4f40a7ea72168024"} Oct 03 11:31:43 crc kubenswrapper[4990]: I1003 11:31:43.872077 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:31:43 crc kubenswrapper[4990]: E1003 11:31:43.872940 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:31:44 crc kubenswrapper[4990]: I1003 11:31:44.346247 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9d0ce5a3-e78d-4cfa-a405-596641465359","Type":"ContainerStarted","Data":"b298c1ea95d45b871ebf05c50237e482e15c44f392aad885209c4e8427a9bd7d"} Oct 03 11:31:44 crc kubenswrapper[4990]: I1003 11:31:44.346571 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:44 crc kubenswrapper[4990]: I1003 11:31:44.348545 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 03 11:31:44 crc kubenswrapper[4990]: I1003 11:31:44.377273 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" 
podStartSLOduration=5.839977914 podStartE2EDuration="26.377251328s" podCreationTimestamp="2025-10-03 11:31:18 +0000 UTC" firstStartedPulling="2025-10-03 11:31:19.787620567 +0000 UTC m=+6461.584252424" lastFinishedPulling="2025-10-03 11:31:40.324893981 +0000 UTC m=+6482.121525838" observedRunningTime="2025-10-03 11:31:44.370475654 +0000 UTC m=+6486.167107541" watchObservedRunningTime="2025-10-03 11:31:44.377251328 +0000 UTC m=+6486.173883185"
Oct 03 11:31:45 crc kubenswrapper[4990]: I1003 11:31:45.361051 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerStarted","Data":"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"}
Oct 03 11:31:48 crc kubenswrapper[4990]: I1003 11:31:48.419905 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerStarted","Data":"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"}
Oct 03 11:31:48 crc kubenswrapper[4990]: I1003 11:31:48.445218 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.682114119 podStartE2EDuration="31.445202617s" podCreationTimestamp="2025-10-03 11:31:17 +0000 UTC" firstStartedPulling="2025-10-03 11:31:20.24426693 +0000 UTC m=+6462.040898788" lastFinishedPulling="2025-10-03 11:31:48.007355419 +0000 UTC m=+6489.803987286" observedRunningTime="2025-10-03 11:31:48.444311764 +0000 UTC m=+6490.240943611" watchObservedRunningTime="2025-10-03 11:31:48.445202617 +0000 UTC m=+6490.241834474"
Oct 03 11:31:49 crc kubenswrapper[4990]: I1003 11:31:49.480790 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Oct 03 11:31:49 crc kubenswrapper[4990]: I1003 11:31:49.482309 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Oct 03 11:31:49 crc kubenswrapper[4990]: I1003 11:31:49.484930 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Oct 03 11:31:50 crc kubenswrapper[4990]: I1003 11:31:50.442868 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.397614 4990 scope.go:117] "RemoveContainer" containerID="1c2ddb694a307fec69c562e63d4a90372dbd043d780e0efa2ea883d8d9052174"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.424166 4990 scope.go:117] "RemoveContainer" containerID="bb0224101efd5cd61ffb912e6a4d75ff6ae4bff946a7391a2939c8cda43f41c3"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.491652 4990 scope.go:117] "RemoveContainer" containerID="8a6412045df80c6344d107bc80b500a7a5206941658e3d2d8545c0f413a8234e"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.816907 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.817165 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" containerName="openstackclient" containerID="cri-o://8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2" gracePeriod=2
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.827167 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.848367 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 03 11:31:51 crc kubenswrapper[4990]: E1003 11:31:51.848875 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" containerName="openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.848901 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" containerName="openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.849186 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" containerName="openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.850628 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.858493 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.868118 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" podUID="4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.938440 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.938803 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.939008 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:51 crc kubenswrapper[4990]: I1003 11:31:51.939105 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdpw\" (UniqueName: \"kubernetes.io/projected/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-kube-api-access-lbdpw\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.040421 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.040637 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.040686 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbdpw\" (UniqueName: \"kubernetes.io/projected/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-kube-api-access-lbdpw\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.040709 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.042134 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.050082 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-openstack-config-secret\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.050276 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.081975 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbdpw\" (UniqueName: \"kubernetes.io/projected/4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a-kube-api-access-lbdpw\") pod \"openstackclient\" (UID: \"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a\") " pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.177670 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:31:52 crc kubenswrapper[4990]: I1003 11:31:52.791905 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.138240 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.503584 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="prometheus" containerID="cri-o://04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028" gracePeriod=600
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.503970 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a","Type":"ContainerStarted","Data":"c495daf145df9252b65540a38657775c475409cd2cbc3f3204dd3cc27ec8a644"}
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.511904 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a","Type":"ContainerStarted","Data":"ee5893dc4f3af03884c1ccbbb5c34c7f24a46e5a791934e31b4f09c838eb9284"}
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.504084 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="thanos-sidecar" containerID="cri-o://b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5" gracePeriod=600
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.504056 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="config-reloader" containerID="cri-o://187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d" gracePeriod=600
Oct 03 11:31:53 crc kubenswrapper[4990]: I1003 11:31:53.532470 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.532452413 podStartE2EDuration="2.532452413s" podCreationTimestamp="2025-10-03 11:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:31:53.530933264 +0000 UTC m=+6495.327565141" watchObservedRunningTime="2025-10-03 11:31:53.532452413 +0000 UTC m=+6495.329084270"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.265399 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.412932 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret\") pod \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.413067 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle\") pod \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.413129 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config\") pod \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.413291 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6xc8\" (UniqueName: \"kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8\") pod \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\" (UID: \"0a6c1508-60e9-4a8a-a94b-42a69020f6aa\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.427388 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8" (OuterVolumeSpecName: "kube-api-access-z6xc8") pod "0a6c1508-60e9-4a8a-a94b-42a69020f6aa" (UID: "0a6c1508-60e9-4a8a-a94b-42a69020f6aa"). InnerVolumeSpecName "kube-api-access-z6xc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.462646 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a6c1508-60e9-4a8a-a94b-42a69020f6aa" (UID: "0a6c1508-60e9-4a8a-a94b-42a69020f6aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.467578 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0a6c1508-60e9-4a8a-a94b-42a69020f6aa" (UID: "0a6c1508-60e9-4a8a-a94b-42a69020f6aa"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.492378 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0a6c1508-60e9-4a8a-a94b-42a69020f6aa" (UID: "0a6c1508-60e9-4a8a-a94b-42a69020f6aa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.520601 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.520629 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.520639 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-openstack-config\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.520648 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6xc8\" (UniqueName: \"kubernetes.io/projected/0a6c1508-60e9-4a8a-a94b-42a69020f6aa-kube-api-access-z6xc8\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541053 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541849 4990 generic.go:334] "Generic (PLEG): container finished" podID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5" exitCode=0
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541876 4990 generic.go:334] "Generic (PLEG): container finished" podID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d" exitCode=0
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541887 4990 generic.go:334] "Generic (PLEG): container finished" podID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028" exitCode=0
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541955 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerDied","Data":"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"}
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541984 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerDied","Data":"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"}
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.541997 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerDied","Data":"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"}
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.542008 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"983a5c7a-2886-45a2-9ca7-d432c7d79634","Type":"ContainerDied","Data":"c8459d1214f30b71a27d735b4182fd3e44b769e59b3497ee8109d8112da8385c"}
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.542026 4990 scope.go:117] "RemoveContainer" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.551865 4990 generic.go:334] "Generic (PLEG): container finished" podID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" containerID="8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2" exitCode=137
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.552484 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.600995 4990 scope.go:117] "RemoveContainer" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.606749 4990 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" podUID="4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.621681 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.621751 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.621820 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.621856 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.621918 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4ck5\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.622072 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.622161 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.622240 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out\") pod \"983a5c7a-2886-45a2-9ca7-d432c7d79634\" (UID: \"983a5c7a-2886-45a2-9ca7-d432c7d79634\") "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.624877 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.625818 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config" (OuterVolumeSpecName: "config") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.630234 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out" (OuterVolumeSpecName: "config-out") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.630443 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5" (OuterVolumeSpecName: "kube-api-access-q4ck5") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "kube-api-access-q4ck5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.630611 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.630654 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.643812 4990 scope.go:117] "RemoveContainer" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.662766 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config" (OuterVolumeSpecName: "web-config") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.673312 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "983a5c7a-2886-45a2-9ca7-d432c7d79634" (UID: "983a5c7a-2886-45a2-9ca7-d432c7d79634"). InnerVolumeSpecName "pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726468 4990 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/983a5c7a-2886-45a2-9ca7-d432c7d79634-config-out\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726549 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-config\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726559 4990 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-web-config\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726569 4990 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-tls-assets\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726579 4990 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/983a5c7a-2886-45a2-9ca7-d432c7d79634-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726590 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4ck5\" (UniqueName: \"kubernetes.io/projected/983a5c7a-2886-45a2-9ca7-d432c7d79634-kube-api-access-q4ck5\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726629 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") on node \"crc\" "
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.726644 4990 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/983a5c7a-2886-45a2-9ca7-d432c7d79634-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.768100 4990 scope.go:117] "RemoveContainer" containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.768577 4990 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.768734 4990 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3") on node "crc"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.825713 4990 scope.go:117] "RemoveContainer" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"
Oct 03 11:31:54 crc kubenswrapper[4990]: E1003 11:31:54.826359 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": container with ID starting with b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5 not found: ID does not exist" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.826408 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"} err="failed to get container status \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": rpc error: code = NotFound desc = could not find container \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": container with ID starting with b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.826439 4990 scope.go:117] "RemoveContainer" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.828358 4990 reconciler_common.go:293] "Volume detached for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") on node \"crc\" DevicePath \"\""
Oct 03 11:31:54 crc kubenswrapper[4990]: E1003 11:31:54.829032 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": container with ID starting with 187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d not found: ID does not exist" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.829074 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"} err="failed to get container status \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": rpc error: code = NotFound desc = could not find container \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": container with ID starting with 187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.829105 4990 scope.go:117] "RemoveContainer" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"
Oct 03 11:31:54 crc kubenswrapper[4990]: E1003 11:31:54.829565 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": container with ID starting with 04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028 not found: ID does not exist" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.829624 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"} err="failed to get container status \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": rpc error: code = NotFound desc = could not find container \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": container with ID starting with 04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.829663 4990 scope.go:117] "RemoveContainer" containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"
Oct 03 11:31:54 crc kubenswrapper[4990]: E1003 11:31:54.830130 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": container with ID starting with 2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0 not found: ID does not exist" containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.830166 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"} err="failed to get container status \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": rpc error: code = NotFound desc = could not find container \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": container with ID starting with 2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.830185 4990 scope.go:117] "RemoveContainer" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.831731 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"} err="failed to get container status \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": rpc error: code = NotFound desc = could not find container \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": container with ID starting with b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.831778 4990 scope.go:117] "RemoveContainer" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.832294 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"} err="failed to get container status \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": rpc error: code = NotFound desc = could not find container \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": container with ID starting with 187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.832319 4990 scope.go:117] "RemoveContainer" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.833503 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"} err="failed to get container status \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": rpc error: code = NotFound desc = could not find container \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": container with ID starting with 04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.833546 4990 scope.go:117] "RemoveContainer" containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.834506 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"} err="failed to get container status \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": rpc error: code = NotFound desc = could not find container \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": container with ID starting with 2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.834573 4990 scope.go:117] "RemoveContainer" containerID="b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.834890 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5"} err="failed to get container status \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": rpc error: code = NotFound desc = could not find container \"b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5\": container with ID starting with b7e53e715d42d3a61bf0d8fa2ea0c966c11080cc139e5aa7d508fcbc4deb56a5 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.834916 4990 scope.go:117] "RemoveContainer" containerID="187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.835881 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d"} err="failed to get container status \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": rpc error: code = NotFound desc = could not find container \"187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d\": container with ID starting with 187b93b776ef0ad5328aaa087338d26093b55ccd3aaac12e9319cc1f395b618d not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.835912 4990 scope.go:117] "RemoveContainer" containerID="04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.836630 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028"} err="failed to get container status \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": rpc error: code = NotFound desc = could not find container \"04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028\": container with ID starting with 04afa46fca6543f882127e2bfca74a6cc3945c62560c35070f1ee5267f066028 not found: ID does not exist"
Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.836657 4990 scope.go:117] "RemoveContainer"
containerID="2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0" Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.838273 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0"} err="failed to get container status \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": rpc error: code = NotFound desc = could not find container \"2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0\": container with ID starting with 2de32f9b761e86f741449b485d88aa74367ac26029f57f7eaa7da1e5563b0fe0 not found: ID does not exist" Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.838301 4990 scope.go:117] "RemoveContainer" containerID="8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2" Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.860396 4990 scope.go:117] "RemoveContainer" containerID="8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2" Oct 03 11:31:54 crc kubenswrapper[4990]: E1003 11:31:54.860979 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2\": container with ID starting with 8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2 not found: ID does not exist" containerID="8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2" Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.861019 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2"} err="failed to get container status \"8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2\": rpc error: code = NotFound desc = could not find container \"8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2\": container with ID starting with 
8127ab5ad5bd87f2e2f5d3dcba4f8d6d7f8345906766ec8375d33597177693b2 not found: ID does not exist" Oct 03 11:31:54 crc kubenswrapper[4990]: I1003 11:31:54.886205 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6c1508-60e9-4a8a-a94b-42a69020f6aa" path="/var/lib/kubelet/pods/0a6c1508-60e9-4a8a-a94b-42a69020f6aa/volumes" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.569788 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.602289 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.619927 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.628855 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:55 crc kubenswrapper[4990]: E1003 11:31:55.629967 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="config-reloader" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.629990 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="config-reloader" Oct 03 11:31:55 crc kubenswrapper[4990]: E1003 11:31:55.630022 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="thanos-sidecar" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630028 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="thanos-sidecar" Oct 03 11:31:55 crc kubenswrapper[4990]: E1003 11:31:55.630047 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" 
containerName="init-config-reloader" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630054 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="init-config-reloader" Oct 03 11:31:55 crc kubenswrapper[4990]: E1003 11:31:55.630065 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="prometheus" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630071 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="prometheus" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630259 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="thanos-sidecar" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630599 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="config-reloader" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.630612 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="prometheus" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.632809 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.638821 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.638886 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.638839 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.639132 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.639690 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-r4kz8" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.639969 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.657008 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.659636 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750361 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/68ed3972-355d-4bd7-a2db-91814bf69cef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc 
kubenswrapper[4990]: I1003 11:31:55.750400 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh47q\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-kube-api-access-mh47q\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750448 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750472 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750498 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750564 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750593 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750659 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750715 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68ed3972-355d-4bd7-a2db-91814bf69cef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750911 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.750979 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853019 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853372 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68ed3972-355d-4bd7-a2db-91814bf69cef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853414 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853439 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853534 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/68ed3972-355d-4bd7-a2db-91814bf69cef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853553 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh47q\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-kube-api-access-mh47q\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853585 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853609 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853632 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " 
pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853647 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.853667 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.854725 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/68ed3972-355d-4bd7-a2db-91814bf69cef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.856589 4990 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.856650 4990 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8d775c517a9483db05ad2f3a1e80ecb74560382def537dc91feb7eb4e9c1dda4/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.859304 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.859656 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.859663 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68ed3972-355d-4bd7-a2db-91814bf69cef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.861160 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.866278 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.866483 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.866829 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.867364 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/68ed3972-355d-4bd7-a2db-91814bf69cef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.875271 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mh47q\" (UniqueName: \"kubernetes.io/projected/68ed3972-355d-4bd7-a2db-91814bf69cef-kube-api-access-mh47q\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.917493 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-28092aba-5858-446c-ac0f-ba8bbdaefad3\") pod \"prometheus-metric-storage-0\" (UID: \"68ed3972-355d-4bd7-a2db-91814bf69cef\") " pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:55 crc kubenswrapper[4990]: I1003 11:31:55.958016 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.389264 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.393102 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.398959 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.399210 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.408148 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.471524 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.471777 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.471859 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.472218 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " 
pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.472434 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.472473 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fgjx\" (UniqueName: \"kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.472722 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.497777 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574056 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574356 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " 
pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574375 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574397 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574462 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574533 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574550 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgjx\" (UniqueName: \"kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.574920 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.575062 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.581300 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.581406 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.581476 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.583308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.587609 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerStarted","Data":"fffa608825c1bd28121a7a88c144312621165d05ced0de2bd6ebf1f331792453"} Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.592409 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgjx\" (UniqueName: \"kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx\") pod \"ceilometer-0\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.729037 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.872994 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:31:56 crc kubenswrapper[4990]: E1003 11:31:56.873204 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:31:56 crc kubenswrapper[4990]: I1003 11:31:56.889171 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" path="/var/lib/kubelet/pods/983a5c7a-2886-45a2-9ca7-d432c7d79634/volumes" Oct 03 11:31:57 crc kubenswrapper[4990]: I1003 11:31:57.190706 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:31:57 crc kubenswrapper[4990]: I1003 11:31:57.481332 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" 
podUID="983a5c7a-2886-45a2-9ca7-d432c7d79634" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.148:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 11:31:57 crc kubenswrapper[4990]: I1003 11:31:57.597460 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerStarted","Data":"79b44459146c7df7468d7ca5d476a1a0fdf6b30715ea1ff1d6c3063d6bca07a4"} Oct 03 11:31:58 crc kubenswrapper[4990]: I1003 11:31:58.616789 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerStarted","Data":"d16c6c3bdfcd1ab0ea2bc6c88f1b5ee61613eea0d97f3d93714b997c7083039f"} Oct 03 11:31:59 crc kubenswrapper[4990]: I1003 11:31:59.626646 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerStarted","Data":"6a3e40c9f234efc7e30639529a1fbbfc4416936741e319dfafc29cae513a52f4"} Oct 03 11:32:00 crc kubenswrapper[4990]: I1003 11:32:00.639170 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerStarted","Data":"335e5ce09723225238f65a7c690d090b9e3bc14665facfa16b538287b6699613"} Oct 03 11:32:00 crc kubenswrapper[4990]: I1003 11:32:00.642198 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerStarted","Data":"ac43cd5317eac922acd61568b603019e76507ca6318bbde308dcb9b114a1f297"} Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.042259 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cd6sv"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.051222 4990 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-db-create-7m2pb"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.062072 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jwwc2"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.072638 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jwwc2"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.081624 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7m2pb"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.090231 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cd6sv"] Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.665769 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerStarted","Data":"a3caabd6334a5b55b45d5bdf44238284423e8b8c9c882e429a21392110a2fa22"} Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.666098 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.706760 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6617165489999999 podStartE2EDuration="6.706735508s" podCreationTimestamp="2025-10-03 11:31:56 +0000 UTC" firstStartedPulling="2025-10-03 11:31:57.203185109 +0000 UTC m=+6498.999816966" lastFinishedPulling="2025-10-03 11:32:02.248204068 +0000 UTC m=+6504.044835925" observedRunningTime="2025-10-03 11:32:02.68585154 +0000 UTC m=+6504.482483397" watchObservedRunningTime="2025-10-03 11:32:02.706735508 +0000 UTC m=+6504.503367375" Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.885782 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94f2641-c80a-4273-9b61-2954040a76fb" 
path="/var/lib/kubelet/pods/c94f2641-c80a-4273-9b61-2954040a76fb/volumes" Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.886402 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d748b646-6b70-42c8-8bc5-f8ef381f4eab" path="/var/lib/kubelet/pods/d748b646-6b70-42c8-8bc5-f8ef381f4eab/volumes" Oct 03 11:32:02 crc kubenswrapper[4990]: I1003 11:32:02.888306 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff835e0-d422-414c-a731-49acae82b765" path="/var/lib/kubelet/pods/dff835e0-d422-414c-a731-49acae82b765/volumes" Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.299297 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-fxxg9"] Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.302390 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.316715 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fxxg9"] Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.374519 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr647\" (UniqueName: \"kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647\") pod \"aodh-db-create-fxxg9\" (UID: \"7e224c1a-fc5b-4920-b8f7-05343df90890\") " pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.476211 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr647\" (UniqueName: \"kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647\") pod \"aodh-db-create-fxxg9\" (UID: \"7e224c1a-fc5b-4920-b8f7-05343df90890\") " pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.496106 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nr647\" (UniqueName: \"kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647\") pod \"aodh-db-create-fxxg9\" (UID: \"7e224c1a-fc5b-4920-b8f7-05343df90890\") " pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:05 crc kubenswrapper[4990]: I1003 11:32:05.631911 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.154533 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-fxxg9"] Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.728205 4990 generic.go:334] "Generic (PLEG): container finished" podID="68ed3972-355d-4bd7-a2db-91814bf69cef" containerID="ac43cd5317eac922acd61568b603019e76507ca6318bbde308dcb9b114a1f297" exitCode=0 Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.728262 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerDied","Data":"ac43cd5317eac922acd61568b603019e76507ca6318bbde308dcb9b114a1f297"} Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.736076 4990 generic.go:334] "Generic (PLEG): container finished" podID="7e224c1a-fc5b-4920-b8f7-05343df90890" containerID="c822c5c834509db335b91e0ad6cb8eeda9b4e950669ffd04590371f056abe4c4" exitCode=0 Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.736122 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fxxg9" event={"ID":"7e224c1a-fc5b-4920-b8f7-05343df90890","Type":"ContainerDied","Data":"c822c5c834509db335b91e0ad6cb8eeda9b4e950669ffd04590371f056abe4c4"} Oct 03 11:32:06 crc kubenswrapper[4990]: I1003 11:32:06.736151 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fxxg9" 
event={"ID":"7e224c1a-fc5b-4920-b8f7-05343df90890","Type":"ContainerStarted","Data":"b5318fc20cc15a8eacf5dc5a2166f81481d9e794349eebcd373924c874cae04d"} Oct 03 11:32:07 crc kubenswrapper[4990]: I1003 11:32:07.747272 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerStarted","Data":"5fe45a7e50cd91a5bdb3859618eb7549a8434a8961f560df0f57f7fc80fe4166"} Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.147739 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.321181 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr647\" (UniqueName: \"kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647\") pod \"7e224c1a-fc5b-4920-b8f7-05343df90890\" (UID: \"7e224c1a-fc5b-4920-b8f7-05343df90890\") " Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.333864 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647" (OuterVolumeSpecName: "kube-api-access-nr647") pod "7e224c1a-fc5b-4920-b8f7-05343df90890" (UID: "7e224c1a-fc5b-4920-b8f7-05343df90890"). InnerVolumeSpecName "kube-api-access-nr647". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.423975 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr647\" (UniqueName: \"kubernetes.io/projected/7e224c1a-fc5b-4920-b8f7-05343df90890-kube-api-access-nr647\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.762814 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-fxxg9" event={"ID":"7e224c1a-fc5b-4920-b8f7-05343df90890","Type":"ContainerDied","Data":"b5318fc20cc15a8eacf5dc5a2166f81481d9e794349eebcd373924c874cae04d"} Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.762853 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5318fc20cc15a8eacf5dc5a2166f81481d9e794349eebcd373924c874cae04d" Oct 03 11:32:08 crc kubenswrapper[4990]: I1003 11:32:08.762901 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-fxxg9" Oct 03 11:32:10 crc kubenswrapper[4990]: I1003 11:32:10.783528 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerStarted","Data":"c4962b80eda6bc3420b5236be609219d97d1547500e809c4bc95e50ba5e74d76"} Oct 03 11:32:11 crc kubenswrapper[4990]: I1003 11:32:11.797471 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"68ed3972-355d-4bd7-a2db-91814bf69cef","Type":"ContainerStarted","Data":"543f438731cb64294ff2382dbf736ac91707cd7b6766cc883fc8ddb6a622fb4f"} Oct 03 11:32:11 crc kubenswrapper[4990]: I1003 11:32:11.840362 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.840345967 podStartE2EDuration="16.840345967s" podCreationTimestamp="2025-10-03 11:31:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:32:11.832745572 +0000 UTC m=+6513.629377469" watchObservedRunningTime="2025-10-03 11:32:11.840345967 +0000 UTC m=+6513.636977824" Oct 03 11:32:11 crc kubenswrapper[4990]: I1003 11:32:11.873139 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:32:11 crc kubenswrapper[4990]: E1003 11:32:11.873959 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.032535 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2683-account-create-2s8dt"] Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.045016 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2683-account-create-2s8dt"] Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.058209 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e71c-account-create-s96zj"] Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.067754 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e71c-account-create-s96zj"] Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.075560 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-05a9-account-create-2sp9s"] Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.082850 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-05a9-account-create-2sp9s"] Oct 03 11:32:12 crc 
kubenswrapper[4990]: I1003 11:32:12.889925 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d122437-a3dd-491f-98b1-0487982c980c" path="/var/lib/kubelet/pods/4d122437-a3dd-491f-98b1-0487982c980c/volumes" Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.891429 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ede1ac-72d9-4a04-b026-e206a826e575" path="/var/lib/kubelet/pods/51ede1ac-72d9-4a04-b026-e206a826e575/volumes" Oct 03 11:32:12 crc kubenswrapper[4990]: I1003 11:32:12.894291 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f7e561-6095-4bae-96dc-b5d4fc2db2ed" path="/var/lib/kubelet/pods/93f7e561-6095-4bae-96dc-b5d4fc2db2ed/volumes" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.436348 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1f1f-account-create-kw9mj"] Oct 03 11:32:15 crc kubenswrapper[4990]: E1003 11:32:15.437152 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e224c1a-fc5b-4920-b8f7-05343df90890" containerName="mariadb-database-create" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.437169 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e224c1a-fc5b-4920-b8f7-05343df90890" containerName="mariadb-database-create" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.437403 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e224c1a-fc5b-4920-b8f7-05343df90890" containerName="mariadb-database-create" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.448769 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.451874 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.464321 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1f1f-account-create-kw9mj"] Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.589618 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5cj\" (UniqueName: \"kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj\") pod \"aodh-1f1f-account-create-kw9mj\" (UID: \"5fc4a764-22f0-4166-b68a-b4f45b864d3c\") " pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.692384 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5cj\" (UniqueName: \"kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj\") pod \"aodh-1f1f-account-create-kw9mj\" (UID: \"5fc4a764-22f0-4166-b68a-b4f45b864d3c\") " pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.721808 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5cj\" (UniqueName: \"kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj\") pod \"aodh-1f1f-account-create-kw9mj\" (UID: \"5fc4a764-22f0-4166-b68a-b4f45b864d3c\") " pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.769386 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:15 crc kubenswrapper[4990]: I1003 11:32:15.958141 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 11:32:16 crc kubenswrapper[4990]: I1003 11:32:16.280580 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1f1f-account-create-kw9mj"] Oct 03 11:32:16 crc kubenswrapper[4990]: W1003 11:32:16.284949 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fc4a764_22f0_4166_b68a_b4f45b864d3c.slice/crio-3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915 WatchSource:0}: Error finding container 3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915: Status 404 returned error can't find the container with id 3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915 Oct 03 11:32:16 crc kubenswrapper[4990]: I1003 11:32:16.883353 4990 generic.go:334] "Generic (PLEG): container finished" podID="5fc4a764-22f0-4166-b68a-b4f45b864d3c" containerID="d258f363f562fdbfbb7cb9557de6e67b8d50ea78d3876f8741bb7412aa413c6e" exitCode=0 Oct 03 11:32:16 crc kubenswrapper[4990]: I1003 11:32:16.887243 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1f1f-account-create-kw9mj" event={"ID":"5fc4a764-22f0-4166-b68a-b4f45b864d3c","Type":"ContainerDied","Data":"d258f363f562fdbfbb7cb9557de6e67b8d50ea78d3876f8741bb7412aa413c6e"} Oct 03 11:32:16 crc kubenswrapper[4990]: I1003 11:32:16.887283 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1f1f-account-create-kw9mj" event={"ID":"5fc4a764-22f0-4166-b68a-b4f45b864d3c","Type":"ContainerStarted","Data":"3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915"} Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.336603 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.491019 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j5cj\" (UniqueName: \"kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj\") pod \"5fc4a764-22f0-4166-b68a-b4f45b864d3c\" (UID: \"5fc4a764-22f0-4166-b68a-b4f45b864d3c\") " Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.500140 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj" (OuterVolumeSpecName: "kube-api-access-7j5cj") pod "5fc4a764-22f0-4166-b68a-b4f45b864d3c" (UID: "5fc4a764-22f0-4166-b68a-b4f45b864d3c"). InnerVolumeSpecName "kube-api-access-7j5cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.594246 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j5cj\" (UniqueName: \"kubernetes.io/projected/5fc4a764-22f0-4166-b68a-b4f45b864d3c-kube-api-access-7j5cj\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.904641 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1f1f-account-create-kw9mj" event={"ID":"5fc4a764-22f0-4166-b68a-b4f45b864d3c","Type":"ContainerDied","Data":"3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915"} Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.904677 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f37b1224013336c86413758ea45e8ad0047798b70fe729ce56ef67797ee3915" Oct 03 11:32:18 crc kubenswrapper[4990]: I1003 11:32:18.904731 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1f1f-account-create-kw9mj" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.661826 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-5cdxf"] Oct 03 11:32:20 crc kubenswrapper[4990]: E1003 11:32:20.662724 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc4a764-22f0-4166-b68a-b4f45b864d3c" containerName="mariadb-account-create" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.662742 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc4a764-22f0-4166-b68a-b4f45b864d3c" containerName="mariadb-account-create" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.662990 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc4a764-22f0-4166-b68a-b4f45b864d3c" containerName="mariadb-account-create" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.663955 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.666121 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.666170 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fpzvx" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.666200 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.676982 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5cdxf"] Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.740586 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn94p\" (UniqueName: \"kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p\") pod \"aodh-db-sync-5cdxf\" (UID: 
\"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.740666 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.740870 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.740926 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.843845 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.843924 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 
11:32:20.844023 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn94p\" (UniqueName: \"kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.844113 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.850016 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.851544 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.861022 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.869571 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn94p\" (UniqueName: 
\"kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p\") pod \"aodh-db-sync-5cdxf\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:20 crc kubenswrapper[4990]: I1003 11:32:20.987037 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:21 crc kubenswrapper[4990]: I1003 11:32:21.031993 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6nfpn"] Oct 03 11:32:21 crc kubenswrapper[4990]: I1003 11:32:21.052907 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6nfpn"] Oct 03 11:32:21 crc kubenswrapper[4990]: I1003 11:32:21.545870 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-5cdxf"] Oct 03 11:32:21 crc kubenswrapper[4990]: I1003 11:32:21.932759 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5cdxf" event={"ID":"4f453696-18c2-4d5d-a8e4-051dd710c1e4","Type":"ContainerStarted","Data":"0214b8c48df2223d719f03fd9986bdf09651dd07622480eb9988628d9cfb29d6"} Oct 03 11:32:22 crc kubenswrapper[4990]: I1003 11:32:22.887944 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcadd49-4156-411d-9013-331a25abf52b" path="/var/lib/kubelet/pods/cdcadd49-4156-411d-9013-331a25abf52b/volumes" Oct 03 11:32:25 crc kubenswrapper[4990]: I1003 11:32:25.958260 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 11:32:25 crc kubenswrapper[4990]: I1003 11:32:25.973127 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 11:32:26 crc kubenswrapper[4990]: I1003 11:32:26.007137 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 11:32:26 crc 
kubenswrapper[4990]: I1003 11:32:26.796407 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 11:32:26 crc kubenswrapper[4990]: I1003 11:32:26.872839 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:32:26 crc kubenswrapper[4990]: E1003 11:32:26.873275 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:32:28 crc kubenswrapper[4990]: I1003 11:32:28.034744 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5cdxf" event={"ID":"4f453696-18c2-4d5d-a8e4-051dd710c1e4","Type":"ContainerStarted","Data":"d55095ecf406905075e0bc767c8cd3f4852e55bd72d1c1151bc6f4e9e947cd43"} Oct 03 11:32:28 crc kubenswrapper[4990]: I1003 11:32:28.073423 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-5cdxf" podStartSLOduration=2.731235243 podStartE2EDuration="8.073378785s" podCreationTimestamp="2025-10-03 11:32:20 +0000 UTC" firstStartedPulling="2025-10-03 11:32:21.568019909 +0000 UTC m=+6523.364651766" lastFinishedPulling="2025-10-03 11:32:26.910163431 +0000 UTC m=+6528.706795308" observedRunningTime="2025-10-03 11:32:28.05924228 +0000 UTC m=+6529.855874147" watchObservedRunningTime="2025-10-03 11:32:28.073378785 +0000 UTC m=+6529.870010642" Oct 03 11:32:30 crc kubenswrapper[4990]: I1003 11:32:30.057941 4990 generic.go:334] "Generic (PLEG): container finished" podID="4f453696-18c2-4d5d-a8e4-051dd710c1e4" containerID="d55095ecf406905075e0bc767c8cd3f4852e55bd72d1c1151bc6f4e9e947cd43" exitCode=0 Oct 
03 11:32:30 crc kubenswrapper[4990]: I1003 11:32:30.058006 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5cdxf" event={"ID":"4f453696-18c2-4d5d-a8e4-051dd710c1e4","Type":"ContainerDied","Data":"d55095ecf406905075e0bc767c8cd3f4852e55bd72d1c1151bc6f4e9e947cd43"} Oct 03 11:32:30 crc kubenswrapper[4990]: I1003 11:32:30.862535 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:30 crc kubenswrapper[4990]: I1003 11:32:30.863280 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" containerName="kube-state-metrics" containerID="cri-o://160434ae19f253a2221271e7ef00e021da331c43fd335bad77dccfc278d5c6d2" gracePeriod=30 Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.073844 4990 generic.go:334] "Generic (PLEG): container finished" podID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" containerID="160434ae19f253a2221271e7ef00e021da331c43fd335bad77dccfc278d5c6d2" exitCode=2 Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.074023 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42d04580-6d3d-47a8-b2f1-5e3b5314f21f","Type":"ContainerDied","Data":"160434ae19f253a2221271e7ef00e021da331c43fd335bad77dccfc278d5c6d2"} Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.643052 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.724150 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7kpk\" (UniqueName: \"kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk\") pod \"42d04580-6d3d-47a8-b2f1-5e3b5314f21f\" (UID: \"42d04580-6d3d-47a8-b2f1-5e3b5314f21f\") " Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.742149 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk" (OuterVolumeSpecName: "kube-api-access-k7kpk") pod "42d04580-6d3d-47a8-b2f1-5e3b5314f21f" (UID: "42d04580-6d3d-47a8-b2f1-5e3b5314f21f"). InnerVolumeSpecName "kube-api-access-k7kpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.761635 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.826270 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7kpk\" (UniqueName: \"kubernetes.io/projected/42d04580-6d3d-47a8-b2f1-5e3b5314f21f-kube-api-access-k7kpk\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.927491 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn94p\" (UniqueName: \"kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p\") pod \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.927573 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts\") pod \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.927628 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle\") pod \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.927718 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data\") pod \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\" (UID: \"4f453696-18c2-4d5d-a8e4-051dd710c1e4\") " Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.930964 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p" (OuterVolumeSpecName: 
"kube-api-access-kn94p") pod "4f453696-18c2-4d5d-a8e4-051dd710c1e4" (UID: "4f453696-18c2-4d5d-a8e4-051dd710c1e4"). InnerVolumeSpecName "kube-api-access-kn94p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.931303 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts" (OuterVolumeSpecName: "scripts") pod "4f453696-18c2-4d5d-a8e4-051dd710c1e4" (UID: "4f453696-18c2-4d5d-a8e4-051dd710c1e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.963036 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f453696-18c2-4d5d-a8e4-051dd710c1e4" (UID: "4f453696-18c2-4d5d-a8e4-051dd710c1e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:31 crc kubenswrapper[4990]: I1003 11:32:31.975626 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data" (OuterVolumeSpecName: "config-data") pod "4f453696-18c2-4d5d-a8e4-051dd710c1e4" (UID: "4f453696-18c2-4d5d-a8e4-051dd710c1e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.029811 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.029846 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.029855 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn94p\" (UniqueName: \"kubernetes.io/projected/4f453696-18c2-4d5d-a8e4-051dd710c1e4-kube-api-access-kn94p\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.029865 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f453696-18c2-4d5d-a8e4-051dd710c1e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.086585 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-5cdxf" event={"ID":"4f453696-18c2-4d5d-a8e4-051dd710c1e4","Type":"ContainerDied","Data":"0214b8c48df2223d719f03fd9986bdf09651dd07622480eb9988628d9cfb29d6"} Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.086627 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0214b8c48df2223d719f03fd9986bdf09651dd07622480eb9988628d9cfb29d6" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.086672 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-5cdxf" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.088940 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"42d04580-6d3d-47a8-b2f1-5e3b5314f21f","Type":"ContainerDied","Data":"e6eca64076d4eb41ca36d85bd1f764e76fe348eee2d4f357d63a66a68e068639"} Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.088969 4990 scope.go:117] "RemoveContainer" containerID="160434ae19f253a2221271e7ef00e021da331c43fd335bad77dccfc278d5c6d2" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.089034 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.144474 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.186657 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.199170 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:32 crc kubenswrapper[4990]: E1003 11:32:32.200368 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f453696-18c2-4d5d-a8e4-051dd710c1e4" containerName="aodh-db-sync" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.200418 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f453696-18c2-4d5d-a8e4-051dd710c1e4" containerName="aodh-db-sync" Oct 03 11:32:32 crc kubenswrapper[4990]: E1003 11:32:32.200489 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" containerName="kube-state-metrics" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.200497 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" containerName="kube-state-metrics" Oct 03 11:32:32 
crc kubenswrapper[4990]: I1003 11:32:32.200954 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" containerName="kube-state-metrics" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.200987 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f453696-18c2-4d5d-a8e4-051dd710c1e4" containerName="aodh-db-sync" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.202336 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.206145 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.206927 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.230995 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.340481 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25b5\" (UniqueName: \"kubernetes.io/projected/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-api-access-j25b5\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.340547 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.340821 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.340908 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.442859 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25b5\" (UniqueName: \"kubernetes.io/projected/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-api-access-j25b5\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.442915 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.442997 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.443038 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.450302 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.450384 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.450873 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.462268 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25b5\" (UniqueName: \"kubernetes.io/projected/75d40fc0-48e9-410e-b2f7-2b10e6e9a58b-kube-api-access-j25b5\") pod \"kube-state-metrics-0\" (UID: \"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b\") " pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.531759 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 11:32:32 crc kubenswrapper[4990]: I1003 11:32:32.885691 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d04580-6d3d-47a8-b2f1-5e3b5314f21f" path="/var/lib/kubelet/pods/42d04580-6d3d-47a8-b2f1-5e3b5314f21f/volumes" Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.038462 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.101991 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b","Type":"ContainerStarted","Data":"6aaed026ed3a663ea3a6f246e36ee932ee691b9df5d3cc8f0c1605770ccd51d3"} Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.270388 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.270880 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="sg-core" containerID="cri-o://335e5ce09723225238f65a7c690d090b9e3bc14665facfa16b538287b6699613" gracePeriod=30 Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.271014 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-notification-agent" containerID="cri-o://6a3e40c9f234efc7e30639529a1fbbfc4416936741e319dfafc29cae513a52f4" gracePeriod=30 Oct 03 11:32:33 crc kubenswrapper[4990]: I1003 11:32:33.271068 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="proxy-httpd" containerID="cri-o://a3caabd6334a5b55b45d5bdf44238284423e8b8c9c882e429a21392110a2fa22" gracePeriod=30 Oct 03 11:32:33 crc 
kubenswrapper[4990]: I1003 11:32:33.271225 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-central-agent" containerID="cri-o://d16c6c3bdfcd1ab0ea2bc6c88f1b5ee61613eea0d97f3d93714b997c7083039f" gracePeriod=30 Oct 03 11:32:34 crc kubenswrapper[4990]: I1003 11:32:34.123542 4990 generic.go:334] "Generic (PLEG): container finished" podID="51080cd0-384c-4254-a8e2-6524d8848708" containerID="a3caabd6334a5b55b45d5bdf44238284423e8b8c9c882e429a21392110a2fa22" exitCode=0 Oct 03 11:32:34 crc kubenswrapper[4990]: I1003 11:32:34.123889 4990 generic.go:334] "Generic (PLEG): container finished" podID="51080cd0-384c-4254-a8e2-6524d8848708" containerID="335e5ce09723225238f65a7c690d090b9e3bc14665facfa16b538287b6699613" exitCode=2 Oct 03 11:32:34 crc kubenswrapper[4990]: I1003 11:32:34.123617 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerDied","Data":"a3caabd6334a5b55b45d5bdf44238284423e8b8c9c882e429a21392110a2fa22"} Oct 03 11:32:34 crc kubenswrapper[4990]: I1003 11:32:34.123933 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerDied","Data":"335e5ce09723225238f65a7c690d090b9e3bc14665facfa16b538287b6699613"} Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.138133 4990 generic.go:334] "Generic (PLEG): container finished" podID="51080cd0-384c-4254-a8e2-6524d8848708" containerID="d16c6c3bdfcd1ab0ea2bc6c88f1b5ee61613eea0d97f3d93714b997c7083039f" exitCode=0 Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.138225 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerDied","Data":"d16c6c3bdfcd1ab0ea2bc6c88f1b5ee61613eea0d97f3d93714b997c7083039f"} Oct 03 
11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.386878 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.391080 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.394325 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fpzvx" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.394859 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.395859 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.410331 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.536040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.536606 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.536718 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " 
pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.536782 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtvt\" (UniqueName: \"kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.638369 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.638584 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.638647 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.638693 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtvt\" (UniqueName: \"kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.645588 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.650780 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.656804 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtvt\" (UniqueName: \"kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.662771 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data\") pod \"aodh-0\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " pod="openstack/aodh-0" Oct 03 11:32:35 crc kubenswrapper[4990]: I1003 11:32:35.723667 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 11:32:36 crc kubenswrapper[4990]: I1003 11:32:36.065185 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qpvxf"] Oct 03 11:32:36 crc kubenswrapper[4990]: I1003 11:32:36.082504 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qpvxf"] Oct 03 11:32:36 crc kubenswrapper[4990]: I1003 11:32:36.574256 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 11:32:36 crc kubenswrapper[4990]: W1003 11:32:36.576950 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d0aa6af_fb85_4131_9298_5bb5c65ff45d.slice/crio-7bbfbd97ffbd89f2adcf70ec83ad016aba86418777ff74fa6675614fee8cd685 WatchSource:0}: Error finding container 7bbfbd97ffbd89f2adcf70ec83ad016aba86418777ff74fa6675614fee8cd685: Status 404 returned error can't find the container with id 7bbfbd97ffbd89f2adcf70ec83ad016aba86418777ff74fa6675614fee8cd685 Oct 03 11:32:36 crc kubenswrapper[4990]: I1003 11:32:36.886288 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f302eb-c586-4bf2-aa6b-31ae2585cace" path="/var/lib/kubelet/pods/46f302eb-c586-4bf2-aa6b-31ae2585cace/volumes" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.028399 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2vjkx"] Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.046494 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2vjkx"] Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.169355 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"75d40fc0-48e9-410e-b2f7-2b10e6e9a58b","Type":"ContainerStarted","Data":"82ef10b65fb636fea245b39db3e12e9eed20a5bd4b260f95c0a0784bcbc79a7e"} Oct 03 11:32:37 crc kubenswrapper[4990]: 
I1003 11:32:37.170699 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.180758 4990 generic.go:334] "Generic (PLEG): container finished" podID="51080cd0-384c-4254-a8e2-6524d8848708" containerID="6a3e40c9f234efc7e30639529a1fbbfc4416936741e319dfafc29cae513a52f4" exitCode=0 Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.180842 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerDied","Data":"6a3e40c9f234efc7e30639529a1fbbfc4416936741e319dfafc29cae513a52f4"} Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.182598 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerStarted","Data":"7bbfbd97ffbd89f2adcf70ec83ad016aba86418777ff74fa6675614fee8cd685"} Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.196348 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.233272512 podStartE2EDuration="5.196328359s" podCreationTimestamp="2025-10-03 11:32:32 +0000 UTC" firstStartedPulling="2025-10-03 11:32:33.084696055 +0000 UTC m=+6534.881327912" lastFinishedPulling="2025-10-03 11:32:36.047751862 +0000 UTC m=+6537.844383759" observedRunningTime="2025-10-03 11:32:37.193362243 +0000 UTC m=+6538.989994090" watchObservedRunningTime="2025-10-03 11:32:37.196328359 +0000 UTC m=+6538.992960216" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.376824 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513692 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513742 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513781 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fgjx\" (UniqueName: \"kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513826 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513899 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.513951 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.514016 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data\") pod \"51080cd0-384c-4254-a8e2-6524d8848708\" (UID: \"51080cd0-384c-4254-a8e2-6524d8848708\") " Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.518216 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx" (OuterVolumeSpecName: "kube-api-access-8fgjx") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "kube-api-access-8fgjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.518833 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.519048 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.525196 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts" (OuterVolumeSpecName: "scripts") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.547180 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.616927 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.616967 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.616976 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.616986 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fgjx\" (UniqueName: \"kubernetes.io/projected/51080cd0-384c-4254-a8e2-6524d8848708-kube-api-access-8fgjx\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc 
kubenswrapper[4990]: I1003 11:32:37.616995 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51080cd0-384c-4254-a8e2-6524d8848708-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.659602 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.668555 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data" (OuterVolumeSpecName: "config-data") pod "51080cd0-384c-4254-a8e2-6524d8848708" (UID: "51080cd0-384c-4254-a8e2-6524d8848708"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.720276 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.720308 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51080cd0-384c-4254-a8e2-6524d8848708-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:37 crc kubenswrapper[4990]: I1003 11:32:37.871802 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:32:37 crc kubenswrapper[4990]: E1003 11:32:37.872146 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.202218 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51080cd0-384c-4254-a8e2-6524d8848708","Type":"ContainerDied","Data":"79b44459146c7df7468d7ca5d476a1a0fdf6b30715ea1ff1d6c3063d6bca07a4"} Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.202551 4990 scope.go:117] "RemoveContainer" containerID="a3caabd6334a5b55b45d5bdf44238284423e8b8c9c882e429a21392110a2fa22" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.202272 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.208106 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerStarted","Data":"a6ee09724490d173463fd7c31821660af33a15f0f61c3d53312677a8b47c4807"} Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.229034 4990 scope.go:117] "RemoveContainer" containerID="335e5ce09723225238f65a7c690d090b9e3bc14665facfa16b538287b6699613" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.245220 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.255127 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.273285 4990 scope.go:117] "RemoveContainer" containerID="6a3e40c9f234efc7e30639529a1fbbfc4416936741e319dfafc29cae513a52f4" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.278347 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:38 crc kubenswrapper[4990]: E1003 11:32:38.278815 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-notification-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.278827 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-notification-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: E1003 11:32:38.278853 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="proxy-httpd" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.278860 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="proxy-httpd" Oct 03 11:32:38 crc 
kubenswrapper[4990]: E1003 11:32:38.278889 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-central-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.278895 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-central-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: E1003 11:32:38.278907 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="sg-core" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.278913 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="sg-core" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.279120 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="proxy-httpd" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.279131 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-notification-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.279143 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="sg-core" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.279159 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="51080cd0-384c-4254-a8e2-6524d8848708" containerName="ceilometer-central-agent" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.280977 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.288861 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.289627 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.291762 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.312780 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.327232 4990 scope.go:117] "RemoveContainer" containerID="d16c6c3bdfcd1ab0ea2bc6c88f1b5ee61613eea0d97f3d93714b997c7083039f" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.346123 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.346236 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.346268 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 
03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.346301 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.346413 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.351390 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.351466 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtc9\" (UniqueName: \"kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.351632 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453245 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453323 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453348 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453373 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453434 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453463 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " 
pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453488 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtc9\" (UniqueName: \"kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.453539 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.457350 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.457487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.457596 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.457783 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.459760 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.460212 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.462204 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.478852 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtc9\" (UniqueName: \"kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9\") pod \"ceilometer-0\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.649154 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.890181 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51080cd0-384c-4254-a8e2-6524d8848708" path="/var/lib/kubelet/pods/51080cd0-384c-4254-a8e2-6524d8848708/volumes" Oct 03 11:32:38 crc kubenswrapper[4990]: I1003 11:32:38.891373 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2766af4-5f69-4efa-9f3d-0899a146114a" path="/var/lib/kubelet/pods/d2766af4-5f69-4efa-9f3d-0899a146114a/volumes" Oct 03 11:32:39 crc kubenswrapper[4990]: I1003 11:32:39.140260 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:39 crc kubenswrapper[4990]: I1003 11:32:39.219754 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerStarted","Data":"136d214c5a87b0791caba5453fdf2114147bfced3d876f30bf0fd4ab0ef29bca"} Oct 03 11:32:39 crc kubenswrapper[4990]: I1003 11:32:39.274439 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 11:32:39 crc kubenswrapper[4990]: I1003 11:32:39.794812 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:41 crc kubenswrapper[4990]: I1003 11:32:41.253086 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerStarted","Data":"1d74e6ab3994a2795ff30aacfe9467fe149635246614d779244ba796d29535aa"} Oct 03 11:32:41 crc kubenswrapper[4990]: I1003 11:32:41.255306 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerStarted","Data":"26cfe19c0a802fa7b6f2f928b6b941144676fd0b577cb926dc115f18ae336f0f"} Oct 03 11:32:42 crc kubenswrapper[4990]: I1003 11:32:42.270123 4990 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerStarted","Data":"a54fba36f01b9730be0973150fa0ed20af2b7ecca163454516ceff3c9bfd6f28"} Oct 03 11:32:42 crc kubenswrapper[4990]: I1003 11:32:42.270776 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerStarted","Data":"8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f"} Oct 03 11:32:42 crc kubenswrapper[4990]: I1003 11:32:42.275066 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerStarted","Data":"22b2a302a22993a84857abab7af7635daf78d8b8856705f6bd87f33123d07ed3"} Oct 03 11:32:42 crc kubenswrapper[4990]: I1003 11:32:42.541044 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.295187 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerStarted","Data":"ca9ec3ce422cba3ae0184ca6f2c1d04d37dafc4b48992aea6e41ec87c3df8713"} Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.295267 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-api" containerID="cri-o://a6ee09724490d173463fd7c31821660af33a15f0f61c3d53312677a8b47c4807" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.295316 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-notifier" containerID="cri-o://22b2a302a22993a84857abab7af7635daf78d8b8856705f6bd87f33123d07ed3" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.295332 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-evaluator" containerID="cri-o://1d74e6ab3994a2795ff30aacfe9467fe149635246614d779244ba796d29535aa" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.295334 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-listener" containerID="cri-o://ca9ec3ce422cba3ae0184ca6f2c1d04d37dafc4b48992aea6e41ec87c3df8713" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300279 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerStarted","Data":"9d640585e01b3af53e374a86e40f8e3b9f8cfb258124cf453b064827e8e6e938"} Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300379 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-central-agent" containerID="cri-o://26cfe19c0a802fa7b6f2f928b6b941144676fd0b577cb926dc115f18ae336f0f" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300455 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="proxy-httpd" containerID="cri-o://9d640585e01b3af53e374a86e40f8e3b9f8cfb258124cf453b064827e8e6e938" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300462 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300499 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" 
containerName="sg-core" containerID="cri-o://a54fba36f01b9730be0973150fa0ed20af2b7ecca163454516ceff3c9bfd6f28" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.300596 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-notification-agent" containerID="cri-o://8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f" gracePeriod=30 Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.326741 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.287460226 podStartE2EDuration="9.326728225s" podCreationTimestamp="2025-10-03 11:32:35 +0000 UTC" firstStartedPulling="2025-10-03 11:32:36.580437024 +0000 UTC m=+6538.377068881" lastFinishedPulling="2025-10-03 11:32:43.619705033 +0000 UTC m=+6545.416336880" observedRunningTime="2025-10-03 11:32:44.322367163 +0000 UTC m=+6546.118999020" watchObservedRunningTime="2025-10-03 11:32:44.326728225 +0000 UTC m=+6546.123360082" Oct 03 11:32:44 crc kubenswrapper[4990]: I1003 11:32:44.349341 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8869075560000002 podStartE2EDuration="6.349321667s" podCreationTimestamp="2025-10-03 11:32:38 +0000 UTC" firstStartedPulling="2025-10-03 11:32:39.155431884 +0000 UTC m=+6540.952063741" lastFinishedPulling="2025-10-03 11:32:43.617845985 +0000 UTC m=+6545.414477852" observedRunningTime="2025-10-03 11:32:44.346361191 +0000 UTC m=+6546.142993048" watchObservedRunningTime="2025-10-03 11:32:44.349321667 +0000 UTC m=+6546.145953524" Oct 03 11:32:44 crc kubenswrapper[4990]: E1003 11:32:44.761406 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b814402_6995_4051_8a76_54c7a2adcf9c.slice/crio-conmon-8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.319993 4990 generic.go:334] "Generic (PLEG): container finished" podID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerID="9d640585e01b3af53e374a86e40f8e3b9f8cfb258124cf453b064827e8e6e938" exitCode=0 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.320038 4990 generic.go:334] "Generic (PLEG): container finished" podID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerID="a54fba36f01b9730be0973150fa0ed20af2b7ecca163454516ceff3c9bfd6f28" exitCode=2 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.320048 4990 generic.go:334] "Generic (PLEG): container finished" podID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerID="8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f" exitCode=0 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.320100 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerDied","Data":"9d640585e01b3af53e374a86e40f8e3b9f8cfb258124cf453b064827e8e6e938"} Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.320135 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerDied","Data":"a54fba36f01b9730be0973150fa0ed20af2b7ecca163454516ceff3c9bfd6f28"} Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.320149 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerDied","Data":"8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f"} Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322721 4990 generic.go:334] "Generic 
(PLEG): container finished" podID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerID="22b2a302a22993a84857abab7af7635daf78d8b8856705f6bd87f33123d07ed3" exitCode=0 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322752 4990 generic.go:334] "Generic (PLEG): container finished" podID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerID="1d74e6ab3994a2795ff30aacfe9467fe149635246614d779244ba796d29535aa" exitCode=0 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322761 4990 generic.go:334] "Generic (PLEG): container finished" podID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerID="a6ee09724490d173463fd7c31821660af33a15f0f61c3d53312677a8b47c4807" exitCode=0 Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322781 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerDied","Data":"22b2a302a22993a84857abab7af7635daf78d8b8856705f6bd87f33123d07ed3"} Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322806 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerDied","Data":"1d74e6ab3994a2795ff30aacfe9467fe149635246614d779244ba796d29535aa"} Oct 03 11:32:45 crc kubenswrapper[4990]: I1003 11:32:45.322815 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerDied","Data":"a6ee09724490d173463fd7c31821660af33a15f0f61c3d53312677a8b47c4807"} Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.361396 4990 generic.go:334] "Generic (PLEG): container finished" podID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerID="26cfe19c0a802fa7b6f2f928b6b941144676fd0b577cb926dc115f18ae336f0f" exitCode=0 Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.361498 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerDied","Data":"26cfe19c0a802fa7b6f2f928b6b941144676fd0b577cb926dc115f18ae336f0f"} Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.532360 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675139 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675228 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675319 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675379 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhtc9\" (UniqueName: \"kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675399 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675418 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675466 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.675482 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts\") pod \"5b814402-6995-4051-8a76-54c7a2adcf9c\" (UID: \"5b814402-6995-4051-8a76-54c7a2adcf9c\") " Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.676068 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.676077 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.680320 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9" (OuterVolumeSpecName: "kube-api-access-rhtc9") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "kube-api-access-rhtc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.680598 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts" (OuterVolumeSpecName: "scripts") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.720399 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.733032 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777672 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhtc9\" (UniqueName: \"kubernetes.io/projected/5b814402-6995-4051-8a76-54c7a2adcf9c-kube-api-access-rhtc9\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777701 4990 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777712 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777720 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777729 4990 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.777737 4990 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5b814402-6995-4051-8a76-54c7a2adcf9c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.784125 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: 
"5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.792698 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data" (OuterVolumeSpecName: "config-data") pod "5b814402-6995-4051-8a76-54c7a2adcf9c" (UID: "5b814402-6995-4051-8a76-54c7a2adcf9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.880908 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:48 crc kubenswrapper[4990]: I1003 11:32:48.880964 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b814402-6995-4051-8a76-54c7a2adcf9c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.383503 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5b814402-6995-4051-8a76-54c7a2adcf9c","Type":"ContainerDied","Data":"136d214c5a87b0791caba5453fdf2114147bfced3d876f30bf0fd4ab0ef29bca"} Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.383899 4990 scope.go:117] "RemoveContainer" containerID="9d640585e01b3af53e374a86e40f8e3b9f8cfb258124cf453b064827e8e6e938" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.383636 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.425135 4990 scope.go:117] "RemoveContainer" containerID="a54fba36f01b9730be0973150fa0ed20af2b7ecca163454516ceff3c9bfd6f28" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.428983 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.466993 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.486789 4990 scope.go:117] "RemoveContainer" containerID="8cc1d554e4d04c20d185e508ecadaff3e0677e5d1aac5185c7db36b259b6e84f" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.491168 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:49 crc kubenswrapper[4990]: E1003 11:32:49.491776 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="sg-core" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.491806 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="sg-core" Oct 03 11:32:49 crc kubenswrapper[4990]: E1003 11:32:49.491828 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="proxy-httpd" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.491838 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="proxy-httpd" Oct 03 11:32:49 crc kubenswrapper[4990]: E1003 11:32:49.491861 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-notification-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.491870 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-notification-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: E1003 11:32:49.491889 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-central-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.491898 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-central-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.492203 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="proxy-httpd" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.492219 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-notification-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.492239 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="ceilometer-central-agent" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.492261 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" containerName="sg-core" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.494909 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.497788 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.498013 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.498009 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.504704 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.515246 4990 scope.go:117] "RemoveContainer" containerID="26cfe19c0a802fa7b6f2f928b6b941144676fd0b577cb926dc115f18ae336f0f" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597478 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-log-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597556 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-config-data\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597594 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcm9\" (UniqueName: \"kubernetes.io/projected/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-kube-api-access-2fcm9\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " 
pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597650 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-scripts\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597714 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597798 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-run-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.597845 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.598054 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.699778 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.699912 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.699992 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-log-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700014 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-config-data\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700041 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcm9\" (UniqueName: \"kubernetes.io/projected/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-kube-api-access-2fcm9\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700064 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-scripts\") pod \"ceilometer-0\" (UID: 
\"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700088 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700103 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-run-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700611 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-run-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.700776 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-log-httpd\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.705280 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-scripts\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.705751 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.708438 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-config-data\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.709868 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.717151 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.721316 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcm9\" (UniqueName: \"kubernetes.io/projected/f6029224-b0b3-4ec3-a6b9-0745dc24f55c-kube-api-access-2fcm9\") pod \"ceilometer-0\" (UID: \"f6029224-b0b3-4ec3-a6b9-0745dc24f55c\") " pod="openstack/ceilometer-0" Oct 03 11:32:49 crc kubenswrapper[4990]: I1003 11:32:49.820194 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 11:32:50 crc kubenswrapper[4990]: I1003 11:32:50.300478 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 11:32:50 crc kubenswrapper[4990]: I1003 11:32:50.396975 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6029224-b0b3-4ec3-a6b9-0745dc24f55c","Type":"ContainerStarted","Data":"6fe53cbd1aa6bc65da2db09c846e2fb8fa633cc047734dfe460ba51ce4450968"} Oct 03 11:32:50 crc kubenswrapper[4990]: I1003 11:32:50.886122 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b814402-6995-4051-8a76-54c7a2adcf9c" path="/var/lib/kubelet/pods/5b814402-6995-4051-8a76-54c7a2adcf9c/volumes" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.412823 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6029224-b0b3-4ec3-a6b9-0745dc24f55c","Type":"ContainerStarted","Data":"397ffd457154559895a5168bc930828f45b77a68a5e3f1b7c7ec140caaa936ef"} Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.620197 4990 scope.go:117] "RemoveContainer" containerID="9848e116d9e47e5d81a6ab1ffee80ba68d47bf6bfa13e1c0f5c9f038e5923d88" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.648243 4990 scope.go:117] "RemoveContainer" containerID="deea82c013222624c66563b9ac58c134355070e19936c3007d70aad92ce980f8" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.723639 4990 scope.go:117] "RemoveContainer" containerID="a806383d40edd08d38ac187c96d26dabf2df2239893c2bb6f3f5fb4b05ff6592" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.801293 4990 scope.go:117] "RemoveContainer" containerID="8a5f22c8c6643b4bb53283d20653ee9ccf9d6937ca3d23dac2430e1f1aa05997" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.820699 4990 scope.go:117] "RemoveContainer" containerID="119a8884ce5af0cffd38ae3b5fb2fb18b1887228525d4979be825480f86d614a" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 
11:32:51.849675 4990 scope.go:117] "RemoveContainer" containerID="259d004c8ffd58a92028951ef2bd6187492cfcae12954cd45c79af459b54cc13" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.872140 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:32:51 crc kubenswrapper[4990]: E1003 11:32:51.872463 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.893343 4990 scope.go:117] "RemoveContainer" containerID="b50ab588823a7afca54e2ade2be90f69add87860677827e6704600421d118bb9" Oct 03 11:32:51 crc kubenswrapper[4990]: I1003 11:32:51.931984 4990 scope.go:117] "RemoveContainer" containerID="2568fa02d03f14b960503812d7d2fb5ccea241948fbf916153b54587912a556f" Oct 03 11:32:52 crc kubenswrapper[4990]: I1003 11:32:52.000566 4990 scope.go:117] "RemoveContainer" containerID="318bc1cd1da7d59d2a0ce59a38d11f5b793b14747a103bd82edd8599fdba3b55" Oct 03 11:32:52 crc kubenswrapper[4990]: I1003 11:32:52.423001 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6029224-b0b3-4ec3-a6b9-0745dc24f55c","Type":"ContainerStarted","Data":"273a71f254978553f68c5294758e65dd2be2784cfffe6be2c885650d62dbc78b"} Oct 03 11:32:53 crc kubenswrapper[4990]: I1003 11:32:53.436485 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6029224-b0b3-4ec3-a6b9-0745dc24f55c","Type":"ContainerStarted","Data":"e2fad5f53eb286fc5c9614558018bde46c5633e0f2275917e429e2895404df1c"} Oct 03 11:32:54 crc kubenswrapper[4990]: I1003 
11:32:54.451849 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6029224-b0b3-4ec3-a6b9-0745dc24f55c","Type":"ContainerStarted","Data":"63384b080815c93e7e1f2dc7c0b8f86bc7701f84b19c4349a52515a02d6662e6"} Oct 03 11:32:54 crc kubenswrapper[4990]: I1003 11:32:54.452195 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 11:32:54 crc kubenswrapper[4990]: I1003 11:32:54.475072 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.793456886 podStartE2EDuration="5.475048792s" podCreationTimestamp="2025-10-03 11:32:49 +0000 UTC" firstStartedPulling="2025-10-03 11:32:50.312164238 +0000 UTC m=+6552.108796095" lastFinishedPulling="2025-10-03 11:32:53.993756144 +0000 UTC m=+6555.790388001" observedRunningTime="2025-10-03 11:32:54.472495826 +0000 UTC m=+6556.269127673" watchObservedRunningTime="2025-10-03 11:32:54.475048792 +0000 UTC m=+6556.271680649" Oct 03 11:32:55 crc kubenswrapper[4990]: I1003 11:32:55.041446 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5c5lk"] Oct 03 11:32:55 crc kubenswrapper[4990]: I1003 11:32:55.050647 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5c5lk"] Oct 03 11:32:56 crc kubenswrapper[4990]: I1003 11:32:56.905839 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1" path="/var/lib/kubelet/pods/92d3ef26-a73d-4cdb-bab1-fc1b9a1300b1/volumes" Oct 03 11:33:05 crc kubenswrapper[4990]: I1003 11:33:05.872172 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:33:05 crc kubenswrapper[4990]: E1003 11:33:05.873243 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:33:14 crc kubenswrapper[4990]: I1003 11:33:14.680780 4990 generic.go:334] "Generic (PLEG): container finished" podID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerID="ca9ec3ce422cba3ae0184ca6f2c1d04d37dafc4b48992aea6e41ec87c3df8713" exitCode=137 Oct 03 11:33:14 crc kubenswrapper[4990]: I1003 11:33:14.681256 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerDied","Data":"ca9ec3ce422cba3ae0184ca6f2c1d04d37dafc4b48992aea6e41ec87c3df8713"} Oct 03 11:33:14 crc kubenswrapper[4990]: I1003 11:33:14.838220 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.024013 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle\") pod \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.024059 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts\") pod \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.024092 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data\") pod \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\" (UID: 
\"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.024218 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtvt\" (UniqueName: \"kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt\") pod \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\" (UID: \"1d0aa6af-fb85-4131-9298-5bb5c65ff45d\") " Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.030572 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts" (OuterVolumeSpecName: "scripts") pod "1d0aa6af-fb85-4131-9298-5bb5c65ff45d" (UID: "1d0aa6af-fb85-4131-9298-5bb5c65ff45d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.057742 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt" (OuterVolumeSpecName: "kube-api-access-pmtvt") pod "1d0aa6af-fb85-4131-9298-5bb5c65ff45d" (UID: "1d0aa6af-fb85-4131-9298-5bb5c65ff45d"). InnerVolumeSpecName "kube-api-access-pmtvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.127144 4990 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.127199 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtvt\" (UniqueName: \"kubernetes.io/projected/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-kube-api-access-pmtvt\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.149867 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data" (OuterVolumeSpecName: "config-data") pod "1d0aa6af-fb85-4131-9298-5bb5c65ff45d" (UID: "1d0aa6af-fb85-4131-9298-5bb5c65ff45d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.162605 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d0aa6af-fb85-4131-9298-5bb5c65ff45d" (UID: "1d0aa6af-fb85-4131-9298-5bb5c65ff45d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.228601 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.228628 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d0aa6af-fb85-4131-9298-5bb5c65ff45d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.695446 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"1d0aa6af-fb85-4131-9298-5bb5c65ff45d","Type":"ContainerDied","Data":"7bbfbd97ffbd89f2adcf70ec83ad016aba86418777ff74fa6675614fee8cd685"} Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.695730 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.695747 4990 scope.go:117] "RemoveContainer" containerID="ca9ec3ce422cba3ae0184ca6f2c1d04d37dafc4b48992aea6e41ec87c3df8713" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.729229 4990 scope.go:117] "RemoveContainer" containerID="22b2a302a22993a84857abab7af7635daf78d8b8856705f6bd87f33123d07ed3" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.734579 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.767570 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.773960 4990 scope.go:117] "RemoveContainer" containerID="1d74e6ab3994a2795ff30aacfe9467fe149635246614d779244ba796d29535aa" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.777997 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/aodh-0"] Oct 03 11:33:15 crc kubenswrapper[4990]: E1003 11:33:15.778492 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-evaluator" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778525 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-evaluator" Oct 03 11:33:15 crc kubenswrapper[4990]: E1003 11:33:15.778553 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-api" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778561 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-api" Oct 03 11:33:15 crc kubenswrapper[4990]: E1003 11:33:15.778586 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-notifier" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778594 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-notifier" Oct 03 11:33:15 crc kubenswrapper[4990]: E1003 11:33:15.778618 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-listener" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778626 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-listener" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778876 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-api" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778902 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-listener" Oct 03 
11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778922 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-notifier" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.778938 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" containerName="aodh-evaluator" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.781369 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.784476 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.785344 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.785542 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.785080 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.785883 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fpzvx" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.795761 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.820756 4990 scope.go:117] "RemoveContainer" containerID="a6ee09724490d173463fd7c31821660af33a15f0f61c3d53312677a8b47c4807" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.943324 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfw99\" (UniqueName: 
\"kubernetes.io/projected/6cc089cc-aaba-4402-875d-c0abdd5a051e-kube-api-access-gfw99\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.943440 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-public-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.944723 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-internal-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.944880 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.944901 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-scripts\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:15 crc kubenswrapper[4990]: I1003 11:33:15.944927 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-config-data\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 
crc kubenswrapper[4990]: I1003 11:33:16.046411 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfw99\" (UniqueName: \"kubernetes.io/projected/6cc089cc-aaba-4402-875d-c0abdd5a051e-kube-api-access-gfw99\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.046859 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-public-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.047624 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-internal-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.047800 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.047825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-scripts\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.048077 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-config-data\") pod \"aodh-0\" (UID: 
\"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.052184 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-scripts\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.053044 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-config-data\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.053270 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-internal-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.055341 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.055761 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cc089cc-aaba-4402-875d-c0abdd5a051e-public-tls-certs\") pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.064802 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfw99\" (UniqueName: \"kubernetes.io/projected/6cc089cc-aaba-4402-875d-c0abdd5a051e-kube-api-access-gfw99\") 
pod \"aodh-0\" (UID: \"6cc089cc-aaba-4402-875d-c0abdd5a051e\") " pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.116244 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.559249 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.706880 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cc089cc-aaba-4402-875d-c0abdd5a051e","Type":"ContainerStarted","Data":"486bfa39eef9da8715cd2e3ca2abf7ec23cf472c08664b40288d302af9b7b6d2"} Oct 03 11:33:16 crc kubenswrapper[4990]: I1003 11:33:16.883719 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0aa6af-fb85-4131-9298-5bb5c65ff45d" path="/var/lib/kubelet/pods/1d0aa6af-fb85-4131-9298-5bb5c65ff45d/volumes" Oct 03 11:33:17 crc kubenswrapper[4990]: I1003 11:33:17.721452 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cc089cc-aaba-4402-875d-c0abdd5a051e","Type":"ContainerStarted","Data":"5f1814cfa57fd6f083fbfe855b36178788bcb6a9cb05c45481e47d9dc9ba2cbe"} Oct 03 11:33:17 crc kubenswrapper[4990]: I1003 11:33:17.872177 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:33:17 crc kubenswrapper[4990]: E1003 11:33:17.872991 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:33:18 crc kubenswrapper[4990]: I1003 11:33:18.733237 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cc089cc-aaba-4402-875d-c0abdd5a051e","Type":"ContainerStarted","Data":"a0596fa319ed119d5351d996cd7806e0669253a7be38a9cc5ec379343868391e"} Oct 03 11:33:19 crc kubenswrapper[4990]: I1003 11:33:19.744678 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cc089cc-aaba-4402-875d-c0abdd5a051e","Type":"ContainerStarted","Data":"7be67fe9f38a8123665d1bbc2e788af1b01f3a2f05ea5cb5c99a243dedaab3b6"} Oct 03 11:33:19 crc kubenswrapper[4990]: I1003 11:33:19.745204 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cc089cc-aaba-4402-875d-c0abdd5a051e","Type":"ContainerStarted","Data":"901b84f06c3b37bb3b3f8d069127e8f947c4eadd4be828332c42d23f44b0f013"} Oct 03 11:33:19 crc kubenswrapper[4990]: I1003 11:33:19.789717 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.969161144 podStartE2EDuration="4.789696639s" podCreationTimestamp="2025-10-03 11:33:15 +0000 UTC" firstStartedPulling="2025-10-03 11:33:16.566610704 +0000 UTC m=+6578.363242561" lastFinishedPulling="2025-10-03 11:33:19.387146209 +0000 UTC m=+6581.183778056" observedRunningTime="2025-10-03 11:33:19.781860097 +0000 UTC m=+6581.578491954" watchObservedRunningTime="2025-10-03 11:33:19.789696639 +0000 UTC m=+6581.586328486" Oct 03 11:33:19 crc kubenswrapper[4990]: I1003 11:33:19.972457 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.253769 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.257819 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.260780 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.285028 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.397862 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.397930 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6k9b\" (UniqueName: \"kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.397966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.398070 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " 
pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.398200 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.398248 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.499822 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.499909 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6k9b\" (UniqueName: \"kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.499944 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " 
pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.500020 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.500067 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.500095 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.501075 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.501605 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 
11:33:22.501772 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.502150 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.502439 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.520807 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6k9b\" (UniqueName: \"kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b\") pod \"dnsmasq-dns-579dddf88f-cglhg\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:22 crc kubenswrapper[4990]: I1003 11:33:22.579112 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:23 crc kubenswrapper[4990]: W1003 11:33:23.101633 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f44cdd6_f8e3_479d_a81d_0827c4aef504.slice/crio-00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c WatchSource:0}: Error finding container 00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c: Status 404 returned error can't find the container with id 00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c Oct 03 11:33:23 crc kubenswrapper[4990]: I1003 11:33:23.104499 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:23 crc kubenswrapper[4990]: I1003 11:33:23.791787 4990 generic.go:334] "Generic (PLEG): container finished" podID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerID="81e0bcb214bfda1a5f64b9257db50cd9bbee7a6e6d1388f76a0429709fc828fb" exitCode=0 Oct 03 11:33:23 crc kubenswrapper[4990]: I1003 11:33:23.791876 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" event={"ID":"7f44cdd6-f8e3-479d-a81d-0827c4aef504","Type":"ContainerDied","Data":"81e0bcb214bfda1a5f64b9257db50cd9bbee7a6e6d1388f76a0429709fc828fb"} Oct 03 11:33:23 crc kubenswrapper[4990]: I1003 11:33:23.793944 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" event={"ID":"7f44cdd6-f8e3-479d-a81d-0827c4aef504","Type":"ContainerStarted","Data":"00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c"} Oct 03 11:33:24 crc kubenswrapper[4990]: I1003 11:33:24.812980 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" event={"ID":"7f44cdd6-f8e3-479d-a81d-0827c4aef504","Type":"ContainerStarted","Data":"2adde44c0b58fab7053781bea87807693933a6d9f757062e26d4a8d0245178e3"} Oct 03 11:33:24 crc 
kubenswrapper[4990]: I1003 11:33:24.813284 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:24 crc kubenswrapper[4990]: I1003 11:33:24.851057 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" podStartSLOduration=2.851030668 podStartE2EDuration="2.851030668s" podCreationTimestamp="2025-10-03 11:33:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:33:24.842925139 +0000 UTC m=+6586.639557016" watchObservedRunningTime="2025-10-03 11:33:24.851030668 +0000 UTC m=+6586.647662535" Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.580821 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.645963 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.646260 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db566b797-8xnmw" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="dnsmasq-dns" containerID="cri-o://1d6a59ed9a8e5356176c7f83abd61c4f574da44da3f0207502e1d0bad11ddb2a" gracePeriod=10 Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.834157 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb556d4d7-dhjcp"] Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.837780 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.861482 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb556d4d7-dhjcp"] Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.872093 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:33:32 crc kubenswrapper[4990]: E1003 11:33:32.872352 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.893303 4990 generic.go:334] "Generic (PLEG): container finished" podID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerID="1d6a59ed9a8e5356176c7f83abd61c4f574da44da3f0207502e1d0bad11ddb2a" exitCode=0 Oct 03 11:33:32 crc kubenswrapper[4990]: I1003 11:33:32.893343 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db566b797-8xnmw" event={"ID":"5cc2b167-7f62-4c56-9654-58ea4cb892cf","Type":"ContainerDied","Data":"1d6a59ed9a8e5356176c7f83abd61c4f574da44da3f0207502e1d0bad11ddb2a"} Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.043009 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-dns-svc\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.043086 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.045068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-openstack-cell1\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.045440 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.045539 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmksv\" (UniqueName: \"kubernetes.io/projected/7e7e4c39-9c12-4a92-8bd0-842048d14be5-kube-api-access-bmksv\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.045775 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-config\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 
11:33:33.147882 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-config\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.148002 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-dns-svc\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.148041 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.148070 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-openstack-cell1\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.148153 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.148204 4990 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bmksv\" (UniqueName: \"kubernetes.io/projected/7e7e4c39-9c12-4a92-8bd0-842048d14be5-kube-api-access-bmksv\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.149825 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.150506 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-openstack-cell1\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.150859 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-config\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.151371 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-dns-svc\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.151808 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7e7e4c39-9c12-4a92-8bd0-842048d14be5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.168463 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmksv\" (UniqueName: \"kubernetes.io/projected/7e7e4c39-9c12-4a92-8bd0-842048d14be5-kube-api-access-bmksv\") pod \"dnsmasq-dns-5bb556d4d7-dhjcp\" (UID: \"7e7e4c39-9c12-4a92-8bd0-842048d14be5\") " pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.240307 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.352067 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config\") pod \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.352288 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb\") pod \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.352369 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vphq\" (UniqueName: \"kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq\") pod \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.352409 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc\") pod \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.352592 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb\") pod \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\" (UID: \"5cc2b167-7f62-4c56-9654-58ea4cb892cf\") " Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.358770 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq" (OuterVolumeSpecName: "kube-api-access-6vphq") pod "5cc2b167-7f62-4c56-9654-58ea4cb892cf" (UID: "5cc2b167-7f62-4c56-9654-58ea4cb892cf"). InnerVolumeSpecName "kube-api-access-6vphq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.412166 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5cc2b167-7f62-4c56-9654-58ea4cb892cf" (UID: "5cc2b167-7f62-4c56-9654-58ea4cb892cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.427067 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cc2b167-7f62-4c56-9654-58ea4cb892cf" (UID: "5cc2b167-7f62-4c56-9654-58ea4cb892cf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.431289 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5cc2b167-7f62-4c56-9654-58ea4cb892cf" (UID: "5cc2b167-7f62-4c56-9654-58ea4cb892cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.433612 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config" (OuterVolumeSpecName: "config") pod "5cc2b167-7f62-4c56-9654-58ea4cb892cf" (UID: "5cc2b167-7f62-4c56-9654-58ea4cb892cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.456825 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.456865 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.456879 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vphq\" (UniqueName: \"kubernetes.io/projected/5cc2b167-7f62-4c56-9654-58ea4cb892cf-kube-api-access-6vphq\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.456896 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 
11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.456906 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2b167-7f62-4c56-9654-58ea4cb892cf-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.463655 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.921060 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db566b797-8xnmw" event={"ID":"5cc2b167-7f62-4c56-9654-58ea4cb892cf","Type":"ContainerDied","Data":"b52af6258ac3524ddce29ed4dae979a24c845cc47cce06fc51e701c80c00c320"} Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.921330 4990 scope.go:117] "RemoveContainer" containerID="1d6a59ed9a8e5356176c7f83abd61c4f574da44da3f0207502e1d0bad11ddb2a" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.921122 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db566b797-8xnmw" Oct 03 11:33:33 crc kubenswrapper[4990]: I1003 11:33:33.972821 4990 scope.go:117] "RemoveContainer" containerID="d9044de82add60e2e5438d5d5b644b628128a2f5bfe290e6b3832ce24696b46f" Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.028671 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.049558 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db566b797-8xnmw"] Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.069726 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb556d4d7-dhjcp"] Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.884897 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" path="/var/lib/kubelet/pods/5cc2b167-7f62-4c56-9654-58ea4cb892cf/volumes" Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.933881 4990 generic.go:334] "Generic (PLEG): container finished" podID="7e7e4c39-9c12-4a92-8bd0-842048d14be5" containerID="7a0c3f96fdee0f18c559d2e1ed65e8cf5701cfb5171e4313fa92a162d0902809" exitCode=0 Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.933930 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" event={"ID":"7e7e4c39-9c12-4a92-8bd0-842048d14be5","Type":"ContainerDied","Data":"7a0c3f96fdee0f18c559d2e1ed65e8cf5701cfb5171e4313fa92a162d0902809"} Oct 03 11:33:34 crc kubenswrapper[4990]: I1003 11:33:34.935196 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" event={"ID":"7e7e4c39-9c12-4a92-8bd0-842048d14be5","Type":"ContainerStarted","Data":"a4222f7703a42bef34e93e4ee6a02e369503e6fd30e8cb12734471c7dc372654"} Oct 03 11:33:35 crc kubenswrapper[4990]: I1003 11:33:35.944994 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" event={"ID":"7e7e4c39-9c12-4a92-8bd0-842048d14be5","Type":"ContainerStarted","Data":"33a67343bed4bc74ff31bc247000c4617e805a258271342f05c686c5137cde39"} Oct 03 11:33:35 crc kubenswrapper[4990]: I1003 11:33:35.945372 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:35 crc kubenswrapper[4990]: I1003 11:33:35.965887 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" podStartSLOduration=3.965868221 podStartE2EDuration="3.965868221s" podCreationTimestamp="2025-10-03 11:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:33:35.964166627 +0000 UTC m=+6597.760798484" watchObservedRunningTime="2025-10-03 11:33:35.965868221 +0000 UTC m=+6597.762500078" Oct 03 11:33:43 crc kubenswrapper[4990]: I1003 11:33:43.465740 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bb556d4d7-dhjcp" Oct 03 11:33:43 crc kubenswrapper[4990]: I1003 11:33:43.536088 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:43 crc kubenswrapper[4990]: I1003 11:33:43.536348 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="dnsmasq-dns" containerID="cri-o://2adde44c0b58fab7053781bea87807693933a6d9f757062e26d4a8d0245178e3" gracePeriod=10 Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.041504 4990 generic.go:334] "Generic (PLEG): container finished" podID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerID="2adde44c0b58fab7053781bea87807693933a6d9f757062e26d4a8d0245178e3" exitCode=0 Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.041550 4990 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" event={"ID":"7f44cdd6-f8e3-479d-a81d-0827c4aef504","Type":"ContainerDied","Data":"2adde44c0b58fab7053781bea87807693933a6d9f757062e26d4a8d0245178e3"} Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.041859 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" event={"ID":"7f44cdd6-f8e3-479d-a81d-0827c4aef504","Type":"ContainerDied","Data":"00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c"} Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.041874 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fb127566babe70406f57d528213f17b3b6bc048abd61e8e2bab26b75370d5c" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.096032 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.300124 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.300199 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.300225 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc 
kubenswrapper[4990]: I1003 11:33:44.300258 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6k9b\" (UniqueName: \"kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.300299 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.300335 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb\") pod \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\" (UID: \"7f44cdd6-f8e3-479d-a81d-0827c4aef504\") " Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.311809 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b" (OuterVolumeSpecName: "kube-api-access-d6k9b") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "kube-api-access-d6k9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.377318 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.377370 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.379008 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.384440 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config" (OuterVolumeSpecName: "config") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.403642 4990 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-config\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.403684 4990 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.403698 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6k9b\" (UniqueName: \"kubernetes.io/projected/7f44cdd6-f8e3-479d-a81d-0827c4aef504-kube-api-access-d6k9b\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.403715 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.403727 4990 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.411432 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "7f44cdd6-f8e3-479d-a81d-0827c4aef504" (UID: "7f44cdd6-f8e3-479d-a81d-0827c4aef504"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:33:44 crc kubenswrapper[4990]: I1003 11:33:44.505354 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7f44cdd6-f8e3-479d-a81d-0827c4aef504-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 03 11:33:45 crc kubenswrapper[4990]: I1003 11:33:45.049021 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579dddf88f-cglhg" Oct 03 11:33:45 crc kubenswrapper[4990]: I1003 11:33:45.070003 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:45 crc kubenswrapper[4990]: I1003 11:33:45.079861 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-579dddf88f-cglhg"] Oct 03 11:33:45 crc kubenswrapper[4990]: I1003 11:33:45.872135 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:33:45 crc kubenswrapper[4990]: E1003 11:33:45.873548 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:33:46 crc kubenswrapper[4990]: I1003 11:33:46.888621 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" path="/var/lib/kubelet/pods/7f44cdd6-f8e3-479d-a81d-0827c4aef504/volumes" Oct 03 11:33:52 crc kubenswrapper[4990]: I1003 11:33:52.399628 4990 scope.go:117] "RemoveContainer" containerID="4bbfd6d2beded0d50b2894277f97d34264bca25424bd54c7a58dd5c32237b2be" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 
11:33:53.589622 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn"] Oct 03 11:33:53 crc kubenswrapper[4990]: E1003 11:33:53.590666 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.590686 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: E1003 11:33:53.590705 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.590712 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: E1003 11:33:53.590747 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="init" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.590756 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="init" Oct 03 11:33:53 crc kubenswrapper[4990]: E1003 11:33:53.590793 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="init" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.590801 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="init" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.591048 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f44cdd6-f8e3-479d-a81d-0827c4aef504" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.591077 4990 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5cc2b167-7f62-4c56-9654-58ea4cb892cf" containerName="dnsmasq-dns" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.591894 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.594071 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.594341 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.594585 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.594705 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.627269 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn"] Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.731398 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.731859 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.732027 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2rdp\" (UniqueName: \"kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.732068 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.834524 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.834650 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: 
\"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.834738 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2rdp\" (UniqueName: \"kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.834762 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.841443 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.841475 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 
crc kubenswrapper[4990]: I1003 11:33:53.845975 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.862281 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2rdp\" (UniqueName: \"kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:53 crc kubenswrapper[4990]: I1003 11:33:53.916286 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:33:54 crc kubenswrapper[4990]: I1003 11:33:54.693755 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn"] Oct 03 11:33:55 crc kubenswrapper[4990]: I1003 11:33:55.183034 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" event={"ID":"b2ba5395-b429-4f21-9cdb-6845ab7b1716","Type":"ContainerStarted","Data":"33dd1580afc23cbc74c32e01bd17ed1337148f87a2e535520705ad1ea146be12"} Oct 03 11:34:00 crc kubenswrapper[4990]: I1003 11:34:00.875754 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:34:00 crc kubenswrapper[4990]: E1003 11:34:00.876682 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:34:04 crc kubenswrapper[4990]: I1003 11:34:04.271476 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" event={"ID":"b2ba5395-b429-4f21-9cdb-6845ab7b1716","Type":"ContainerStarted","Data":"1a391bd8ca0254614d964708a41995ba461335dcb35158dc9b0b8dbf4b11d2a4"} Oct 03 11:34:04 crc kubenswrapper[4990]: I1003 11:34:04.294522 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" podStartSLOduration=2.151042614 podStartE2EDuration="11.294488836s" podCreationTimestamp="2025-10-03 11:33:53 +0000 UTC" firstStartedPulling="2025-10-03 11:33:54.705410565 +0000 UTC m=+6616.502042422" lastFinishedPulling="2025-10-03 11:34:03.848856787 +0000 UTC m=+6625.645488644" observedRunningTime="2025-10-03 11:34:04.291497849 +0000 UTC m=+6626.088129706" watchObservedRunningTime="2025-10-03 11:34:04.294488836 +0000 UTC m=+6626.091120693" Oct 03 11:34:12 crc kubenswrapper[4990]: I1003 11:34:12.872579 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:34:12 crc kubenswrapper[4990]: E1003 11:34:12.873874 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:34:17 crc kubenswrapper[4990]: I1003 11:34:17.419599 4990 generic.go:334] "Generic (PLEG): container finished" podID="b2ba5395-b429-4f21-9cdb-6845ab7b1716" containerID="1a391bd8ca0254614d964708a41995ba461335dcb35158dc9b0b8dbf4b11d2a4" exitCode=0 Oct 03 11:34:17 crc kubenswrapper[4990]: I1003 11:34:17.419666 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" event={"ID":"b2ba5395-b429-4f21-9cdb-6845ab7b1716","Type":"ContainerDied","Data":"1a391bd8ca0254614d964708a41995ba461335dcb35158dc9b0b8dbf4b11d2a4"} Oct 03 11:34:17 crc kubenswrapper[4990]: E1003 11:34:17.465192 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2ba5395_b429_4f21_9cdb_6845ab7b1716.slice/crio-1a391bd8ca0254614d964708a41995ba461335dcb35158dc9b0b8dbf4b11d2a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2ba5395_b429_4f21_9cdb_6845ab7b1716.slice/crio-conmon-1a391bd8ca0254614d964708a41995ba461335dcb35158dc9b0b8dbf4b11d2a4.scope\": RecentStats: unable to find data in memory cache]" Oct 03 11:34:18 crc kubenswrapper[4990]: I1003 11:34:18.918941 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.006306 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key\") pod \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.006404 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle\") pod \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.006532 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory\") pod \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.006621 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2rdp\" (UniqueName: \"kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp\") pod \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\" (UID: \"b2ba5395-b429-4f21-9cdb-6845ab7b1716\") " Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.011999 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "b2ba5395-b429-4f21-9cdb-6845ab7b1716" (UID: "b2ba5395-b429-4f21-9cdb-6845ab7b1716"). 
InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.013454 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp" (OuterVolumeSpecName: "kube-api-access-c2rdp") pod "b2ba5395-b429-4f21-9cdb-6845ab7b1716" (UID: "b2ba5395-b429-4f21-9cdb-6845ab7b1716"). InnerVolumeSpecName "kube-api-access-c2rdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.038489 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory" (OuterVolumeSpecName: "inventory") pod "b2ba5395-b429-4f21-9cdb-6845ab7b1716" (UID: "b2ba5395-b429-4f21-9cdb-6845ab7b1716"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.039361 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b2ba5395-b429-4f21-9cdb-6845ab7b1716" (UID: "b2ba5395-b429-4f21-9cdb-6845ab7b1716"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.109862 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2rdp\" (UniqueName: \"kubernetes.io/projected/b2ba5395-b429-4f21-9cdb-6845ab7b1716-kube-api-access-c2rdp\") on node \"crc\" DevicePath \"\"" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.109910 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.109922 4990 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.109940 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b2ba5395-b429-4f21-9cdb-6845ab7b1716-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.440488 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" event={"ID":"b2ba5395-b429-4f21-9cdb-6845ab7b1716","Type":"ContainerDied","Data":"33dd1580afc23cbc74c32e01bd17ed1337148f87a2e535520705ad1ea146be12"} Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.440552 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn" Oct 03 11:34:19 crc kubenswrapper[4990]: I1003 11:34:19.440559 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33dd1580afc23cbc74c32e01bd17ed1337148f87a2e535520705ad1ea146be12" Oct 03 11:34:25 crc kubenswrapper[4990]: I1003 11:34:25.052340 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-v68xc"] Oct 03 11:34:25 crc kubenswrapper[4990]: I1003 11:34:25.068093 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-v68xc"] Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.403484 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg"] Oct 03 11:34:26 crc kubenswrapper[4990]: E1003 11:34:26.404041 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ba5395-b429-4f21-9cdb-6845ab7b1716" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.404058 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ba5395-b429-4f21-9cdb-6845ab7b1716" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.404429 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ba5395-b429-4f21-9cdb-6845ab7b1716" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.405375 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.408763 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.410255 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.410352 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.411129 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.419339 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg"] Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.581057 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.581670 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt2fb\" (UniqueName: \"kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.581746 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.581856 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.684485 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt2fb\" (UniqueName: \"kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.684628 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.684721 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.684902 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.693187 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.696158 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.697026 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.721929 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt2fb\" (UniqueName: \"kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.736645 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.871874 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:34:26 crc kubenswrapper[4990]: E1003 11:34:26.872519 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:34:26 crc kubenswrapper[4990]: I1003 11:34:26.905307 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0059b5d8-652d-4a65-a5ba-cacea85aac53" path="/var/lib/kubelet/pods/0059b5d8-652d-4a65-a5ba-cacea85aac53/volumes" Oct 03 11:34:27 crc kubenswrapper[4990]: I1003 11:34:27.342204 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg"] Oct 03 11:34:27 crc kubenswrapper[4990]: I1003 11:34:27.552396 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" event={"ID":"2df99b4a-9b37-4525-923f-469c5d607ce9","Type":"ContainerStarted","Data":"09387f3bab97f1f220cef58ee8799e1515490317fa707c422e26ec7eaa1451d6"} Oct 03 11:34:28 crc kubenswrapper[4990]: I1003 11:34:28.566727 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" event={"ID":"2df99b4a-9b37-4525-923f-469c5d607ce9","Type":"ContainerStarted","Data":"a21ddaad28de374bf9a074d6992a9ca2072ec1d2bdf9b6a464821849b4f165fa"} Oct 03 11:34:28 crc kubenswrapper[4990]: I1003 11:34:28.609458 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" podStartSLOduration=2.140466973 podStartE2EDuration="2.609439013s" podCreationTimestamp="2025-10-03 11:34:26 +0000 UTC" firstStartedPulling="2025-10-03 11:34:27.349919558 +0000 UTC m=+6649.146551415" lastFinishedPulling="2025-10-03 11:34:27.818891598 +0000 UTC m=+6649.615523455" observedRunningTime="2025-10-03 11:34:28.587151859 +0000 UTC m=+6650.383783726" watchObservedRunningTime="2025-10-03 11:34:28.609439013 +0000 UTC m=+6650.406070870" Oct 03 11:34:37 crc kubenswrapper[4990]: I1003 11:34:37.047940 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-cafd-account-create-g48vw"] Oct 03 11:34:37 crc kubenswrapper[4990]: I1003 11:34:37.056893 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-cafd-account-create-g48vw"] Oct 03 11:34:38 crc kubenswrapper[4990]: I1003 11:34:38.889501 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9787f12f-7dca-40cb-a7fb-274c9ddd10a0" path="/var/lib/kubelet/pods/9787f12f-7dca-40cb-a7fb-274c9ddd10a0/volumes" Oct 03 11:34:41 crc kubenswrapper[4990]: I1003 11:34:41.871932 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:34:41 crc 
kubenswrapper[4990]: E1003 11:34:41.873030 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:34:43 crc kubenswrapper[4990]: I1003 11:34:43.032446 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-zxhbz"] Oct 03 11:34:43 crc kubenswrapper[4990]: I1003 11:34:43.046059 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-zxhbz"] Oct 03 11:34:44 crc kubenswrapper[4990]: I1003 11:34:44.891427 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11514f04-f626-42e7-881b-32ccab00f950" path="/var/lib/kubelet/pods/11514f04-f626-42e7-881b-32ccab00f950/volumes" Oct 03 11:34:52 crc kubenswrapper[4990]: I1003 11:34:52.529836 4990 scope.go:117] "RemoveContainer" containerID="3a969e21e0c5507e1e54697b710ed472a73639808141c9ee23ceb0b78bd6eaca" Oct 03 11:34:52 crc kubenswrapper[4990]: I1003 11:34:52.557607 4990 scope.go:117] "RemoveContainer" containerID="f4915968a7edf0a003c53824846a154d587cef54d89409e43272b381c5221ad0" Oct 03 11:34:52 crc kubenswrapper[4990]: I1003 11:34:52.640029 4990 scope.go:117] "RemoveContainer" containerID="288318739d759cf22a0b5b2936a57bcdc233bf00a39340a08b48ee34ea3618be" Oct 03 11:34:54 crc kubenswrapper[4990]: I1003 11:34:54.065580 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-2590-account-create-nzxrf"] Oct 03 11:34:54 crc kubenswrapper[4990]: I1003 11:34:54.085645 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-2590-account-create-nzxrf"] Oct 03 11:34:54 crc kubenswrapper[4990]: I1003 
11:34:54.872370 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:34:54 crc kubenswrapper[4990]: E1003 11:34:54.872682 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:34:54 crc kubenswrapper[4990]: I1003 11:34:54.887580 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c979979a-92fc-4157-b350-cb02173dd164" path="/var/lib/kubelet/pods/c979979a-92fc-4157-b350-cb02173dd164/volumes" Oct 03 11:35:05 crc kubenswrapper[4990]: I1003 11:35:05.872333 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:35:05 crc kubenswrapper[4990]: E1003 11:35:05.873542 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:35:17 crc kubenswrapper[4990]: I1003 11:35:17.872714 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:35:17 crc kubenswrapper[4990]: E1003 11:35:17.873365 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:35:30 crc kubenswrapper[4990]: I1003 11:35:30.872563 4990 scope.go:117] "RemoveContainer" containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:35:31 crc kubenswrapper[4990]: I1003 11:35:31.257633 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974"} Oct 03 11:35:47 crc kubenswrapper[4990]: I1003 11:35:47.066696 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-clsdg"] Oct 03 11:35:47 crc kubenswrapper[4990]: I1003 11:35:47.082750 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-clsdg"] Oct 03 11:35:48 crc kubenswrapper[4990]: I1003 11:35:48.882815 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07" path="/var/lib/kubelet/pods/f1b7e197-3a7b-4f72-aaa5-6ec3d5411f07/volumes" Oct 03 11:35:52 crc kubenswrapper[4990]: I1003 11:35:52.763090 4990 scope.go:117] "RemoveContainer" containerID="a684d3b015b7128a6eeae00842e106fbe7df70d2482af216758084ac8b19b217" Oct 03 11:35:52 crc kubenswrapper[4990]: I1003 11:35:52.805519 4990 scope.go:117] "RemoveContainer" containerID="a3c969f500070b9e0ad1e06107db456ee7c316d723b66879134e39c16dc93f78" Oct 03 11:35:52 crc kubenswrapper[4990]: I1003 11:35:52.882484 4990 scope.go:117] "RemoveContainer" containerID="7dda0d38a98f5ae29c3a9269f330daebd70b7007e274f6a67be6f42da1216544" Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.760383 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.762816 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.791231 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.904104 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.904563 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:27 crc kubenswrapper[4990]: I1003 11:36:27.904636 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8h7l\" (UniqueName: \"kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.006889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8h7l\" (UniqueName: \"kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l\") pod \"community-operators-9ntp5\" (UID: 
\"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.007653 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.008147 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.009819 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.010116 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.027592 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8h7l\" (UniqueName: \"kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l\") pod \"community-operators-9ntp5\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " 
pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.089913 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.657265 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:28 crc kubenswrapper[4990]: I1003 11:36:28.852120 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerStarted","Data":"7e0ac73253f5641f9606f337ecfdab7e5ba94b3e1b5a9730801b845d61a71a51"} Oct 03 11:36:29 crc kubenswrapper[4990]: I1003 11:36:29.865136 4990 generic.go:334] "Generic (PLEG): container finished" podID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerID="80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d" exitCode=0 Oct 03 11:36:29 crc kubenswrapper[4990]: I1003 11:36:29.865189 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerDied","Data":"80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d"} Oct 03 11:36:29 crc kubenswrapper[4990]: I1003 11:36:29.868374 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:36:32 crc kubenswrapper[4990]: I1003 11:36:32.902833 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerStarted","Data":"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e"} Oct 03 11:36:33 crc kubenswrapper[4990]: I1003 11:36:33.921590 4990 generic.go:334] "Generic (PLEG): container finished" podID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" 
containerID="06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e" exitCode=0 Oct 03 11:36:33 crc kubenswrapper[4990]: I1003 11:36:33.921649 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerDied","Data":"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e"} Oct 03 11:36:34 crc kubenswrapper[4990]: I1003 11:36:34.935307 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerStarted","Data":"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9"} Oct 03 11:36:34 crc kubenswrapper[4990]: I1003 11:36:34.958013 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9ntp5" podStartSLOduration=3.465790899 podStartE2EDuration="7.957997067s" podCreationTimestamp="2025-10-03 11:36:27 +0000 UTC" firstStartedPulling="2025-10-03 11:36:29.867989841 +0000 UTC m=+6771.664621698" lastFinishedPulling="2025-10-03 11:36:34.360196009 +0000 UTC m=+6776.156827866" observedRunningTime="2025-10-03 11:36:34.952115215 +0000 UTC m=+6776.748747082" watchObservedRunningTime="2025-10-03 11:36:34.957997067 +0000 UTC m=+6776.754628924" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.747256 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.749651 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.763122 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.895886 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.895982 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.896040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4958\" (UniqueName: \"kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.997759 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.998667 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:35 crc kubenswrapper[4990]: I1003 11:36:35.999375 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4958\" (UniqueName: \"kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:35.999303 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:35.998358 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.019924 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4958\" (UniqueName: \"kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958\") pod \"certified-operators-xmqjm\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.077080 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.536577 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.959153 4990 generic.go:334] "Generic (PLEG): container finished" podID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerID="47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc" exitCode=0 Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.959251 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerDied","Data":"47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc"} Oct 03 11:36:36 crc kubenswrapper[4990]: I1003 11:36:36.959297 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerStarted","Data":"a70f7afef70b2ce5bbf0fb0dd668f59dc1c331c8f87f993eb7740942bedfe7b9"} Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.090734 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.091153 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.188880 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.191821 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.207153 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.207402 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.363665 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.363742 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.363782 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp674\" (UniqueName: \"kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.470239 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content\") pod \"redhat-operators-flgr6\" (UID: 
\"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.470297 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp674\" (UniqueName: \"kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.470485 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.475324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.475368 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.503441 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp674\" (UniqueName: \"kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674\") pod \"redhat-operators-flgr6\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " 
pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:38 crc kubenswrapper[4990]: I1003 11:36:38.520176 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:39 crc kubenswrapper[4990]: W1003 11:36:39.130434 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca26d59_a2d4_4c91_a8f4_b5d89e7b1ac8.slice/crio-9af5ad037dfa1528d20701a562f37aef71c9fa4a5583795d2552b3352425ba65 WatchSource:0}: Error finding container 9af5ad037dfa1528d20701a562f37aef71c9fa4a5583795d2552b3352425ba65: Status 404 returned error can't find the container with id 9af5ad037dfa1528d20701a562f37aef71c9fa4a5583795d2552b3352425ba65 Oct 03 11:36:39 crc kubenswrapper[4990]: I1003 11:36:39.145821 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:36:40 crc kubenswrapper[4990]: I1003 11:36:40.014859 4990 generic.go:334] "Generic (PLEG): container finished" podID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerID="48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c" exitCode=0 Oct 03 11:36:40 crc kubenswrapper[4990]: I1003 11:36:40.014921 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerDied","Data":"48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c"} Oct 03 11:36:40 crc kubenswrapper[4990]: I1003 11:36:40.015175 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerStarted","Data":"9af5ad037dfa1528d20701a562f37aef71c9fa4a5583795d2552b3352425ba65"} Oct 03 11:36:45 crc kubenswrapper[4990]: I1003 11:36:45.075302 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerStarted","Data":"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87"} Oct 03 11:36:45 crc kubenswrapper[4990]: I1003 11:36:45.079441 4990 generic.go:334] "Generic (PLEG): container finished" podID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerID="f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41" exitCode=0 Oct 03 11:36:45 crc kubenswrapper[4990]: I1003 11:36:45.079490 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerDied","Data":"f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41"} Oct 03 11:36:47 crc kubenswrapper[4990]: I1003 11:36:47.123923 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerStarted","Data":"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975"} Oct 03 11:36:47 crc kubenswrapper[4990]: I1003 11:36:47.144923 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xmqjm" podStartSLOduration=2.482027806 podStartE2EDuration="12.144902238s" podCreationTimestamp="2025-10-03 11:36:35 +0000 UTC" firstStartedPulling="2025-10-03 11:36:36.96122417 +0000 UTC m=+6778.757856027" lastFinishedPulling="2025-10-03 11:36:46.624098602 +0000 UTC m=+6788.420730459" observedRunningTime="2025-10-03 11:36:47.141827398 +0000 UTC m=+6788.938459285" watchObservedRunningTime="2025-10-03 11:36:47.144902238 +0000 UTC m=+6788.941534095" Oct 03 11:36:48 crc kubenswrapper[4990]: I1003 11:36:48.176315 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:48 crc kubenswrapper[4990]: I1003 11:36:48.225590 4990 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.147087 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9ntp5" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="registry-server" containerID="cri-o://4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9" gracePeriod=2 Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.661034 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.732604 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content\") pod \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.733252 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8h7l\" (UniqueName: \"kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l\") pod \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.733435 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities\") pod \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\" (UID: \"fc9f12c0-f792-4afe-8c1f-9b61b251d59e\") " Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.734086 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities" (OuterVolumeSpecName: "utilities") pod 
"fc9f12c0-f792-4afe-8c1f-9b61b251d59e" (UID: "fc9f12c0-f792-4afe-8c1f-9b61b251d59e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.770847 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l" (OuterVolumeSpecName: "kube-api-access-z8h7l") pod "fc9f12c0-f792-4afe-8c1f-9b61b251d59e" (UID: "fc9f12c0-f792-4afe-8c1f-9b61b251d59e"). InnerVolumeSpecName "kube-api-access-z8h7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.803268 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc9f12c0-f792-4afe-8c1f-9b61b251d59e" (UID: "fc9f12c0-f792-4afe-8c1f-9b61b251d59e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.836474 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8h7l\" (UniqueName: \"kubernetes.io/projected/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-kube-api-access-z8h7l\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.836532 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:49 crc kubenswrapper[4990]: I1003 11:36:49.836551 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9f12c0-f792-4afe-8c1f-9b61b251d59e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.169397 4990 generic.go:334] "Generic (PLEG): container finished" podID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerID="3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87" exitCode=0 Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.169501 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerDied","Data":"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87"} Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.173047 4990 generic.go:334] "Generic (PLEG): container finished" podID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerID="4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9" exitCode=0 Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.173082 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" 
event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerDied","Data":"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9"} Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.173131 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9ntp5" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.173215 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9ntp5" event={"ID":"fc9f12c0-f792-4afe-8c1f-9b61b251d59e","Type":"ContainerDied","Data":"7e0ac73253f5641f9606f337ecfdab7e5ba94b3e1b5a9730801b845d61a71a51"} Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.173266 4990 scope.go:117] "RemoveContainer" containerID="4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.231109 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.231442 4990 scope.go:117] "RemoveContainer" containerID="06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.240841 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9ntp5"] Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.264276 4990 scope.go:117] "RemoveContainer" containerID="80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.301974 4990 scope.go:117] "RemoveContainer" containerID="4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9" Oct 03 11:36:50 crc kubenswrapper[4990]: E1003 11:36:50.302273 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9\": container 
with ID starting with 4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9 not found: ID does not exist" containerID="4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.302308 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9"} err="failed to get container status \"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9\": rpc error: code = NotFound desc = could not find container \"4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9\": container with ID starting with 4f7890f7fd123f6bbc02f73c3c7ecd7e819cefff5f50e5dfd9f17ec6a4a6eab9 not found: ID does not exist" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.302326 4990 scope.go:117] "RemoveContainer" containerID="06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e" Oct 03 11:36:50 crc kubenswrapper[4990]: E1003 11:36:50.302764 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e\": container with ID starting with 06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e not found: ID does not exist" containerID="06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.302804 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e"} err="failed to get container status \"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e\": rpc error: code = NotFound desc = could not find container \"06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e\": container with ID starting with 06681b2998c8e0a4885175929c39c0383048b4612562f2ab5780c9123fdfdb3e not 
found: ID does not exist" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.302835 4990 scope.go:117] "RemoveContainer" containerID="80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d" Oct 03 11:36:50 crc kubenswrapper[4990]: E1003 11:36:50.303176 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d\": container with ID starting with 80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d not found: ID does not exist" containerID="80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.303202 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d"} err="failed to get container status \"80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d\": rpc error: code = NotFound desc = could not find container \"80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d\": container with ID starting with 80cf53c8ebf90355209ebaf9066314403b5378350932af40cd0c010588d2dc3d not found: ID does not exist" Oct 03 11:36:50 crc kubenswrapper[4990]: I1003 11:36:50.887163 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" path="/var/lib/kubelet/pods/fc9f12c0-f792-4afe-8c1f-9b61b251d59e/volumes" Oct 03 11:36:51 crc kubenswrapper[4990]: I1003 11:36:51.185315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerStarted","Data":"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3"} Oct 03 11:36:51 crc kubenswrapper[4990]: I1003 11:36:51.203648 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-flgr6" podStartSLOduration=2.331065306 podStartE2EDuration="13.20362982s" podCreationTimestamp="2025-10-03 11:36:38 +0000 UTC" firstStartedPulling="2025-10-03 11:36:40.018023952 +0000 UTC m=+6781.814655809" lastFinishedPulling="2025-10-03 11:36:50.890588466 +0000 UTC m=+6792.687220323" observedRunningTime="2025-10-03 11:36:51.202110371 +0000 UTC m=+6792.998742238" watchObservedRunningTime="2025-10-03 11:36:51.20362982 +0000 UTC m=+6793.000261677" Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.078469 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.078904 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.139851 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.295103 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.394316 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.435125 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 11:36:56 crc kubenswrapper[4990]: I1003 11:36:56.435651 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xgt6j" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="registry-server" containerID="cri-o://952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413" gracePeriod=2 Oct 03 11:36:56 crc 
kubenswrapper[4990]: I1003 11:36:56.927946 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.015198 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbfj5\" (UniqueName: \"kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5\") pod \"d13713c9-9230-46ad-8b93-5ea52833d53e\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.015363 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities\") pod \"d13713c9-9230-46ad-8b93-5ea52833d53e\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.015498 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content\") pod \"d13713c9-9230-46ad-8b93-5ea52833d53e\" (UID: \"d13713c9-9230-46ad-8b93-5ea52833d53e\") " Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.016567 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities" (OuterVolumeSpecName: "utilities") pod "d13713c9-9230-46ad-8b93-5ea52833d53e" (UID: "d13713c9-9230-46ad-8b93-5ea52833d53e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.038720 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5" (OuterVolumeSpecName: "kube-api-access-jbfj5") pod "d13713c9-9230-46ad-8b93-5ea52833d53e" (UID: "d13713c9-9230-46ad-8b93-5ea52833d53e"). InnerVolumeSpecName "kube-api-access-jbfj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.063010 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d13713c9-9230-46ad-8b93-5ea52833d53e" (UID: "d13713c9-9230-46ad-8b93-5ea52833d53e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.117198 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.117229 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbfj5\" (UniqueName: \"kubernetes.io/projected/d13713c9-9230-46ad-8b93-5ea52833d53e-kube-api-access-jbfj5\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.117240 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d13713c9-9230-46ad-8b93-5ea52833d53e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.257380 4990 generic.go:334] "Generic (PLEG): container finished" podID="d13713c9-9230-46ad-8b93-5ea52833d53e" 
containerID="952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413" exitCode=0 Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.257703 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgt6j" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.257597 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerDied","Data":"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413"} Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.257994 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgt6j" event={"ID":"d13713c9-9230-46ad-8b93-5ea52833d53e","Type":"ContainerDied","Data":"42b6917ab0d6e650041bd6e641c9ae655a96e0bd502a6e515fbc409bca811757"} Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.258151 4990 scope.go:117] "RemoveContainer" containerID="952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.297804 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.298405 4990 scope.go:117] "RemoveContainer" containerID="29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.308997 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xgt6j"] Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.324901 4990 scope.go:117] "RemoveContainer" containerID="dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.371777 4990 scope.go:117] "RemoveContainer" containerID="952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413" Oct 03 
11:36:57 crc kubenswrapper[4990]: E1003 11:36:57.373593 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413\": container with ID starting with 952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413 not found: ID does not exist" containerID="952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.373674 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413"} err="failed to get container status \"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413\": rpc error: code = NotFound desc = could not find container \"952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413\": container with ID starting with 952f2a8ffb22cdf837f9440c5a84d6013744aade131c0817248ec333a1092413 not found: ID does not exist" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.373730 4990 scope.go:117] "RemoveContainer" containerID="29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60" Oct 03 11:36:57 crc kubenswrapper[4990]: E1003 11:36:57.375703 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60\": container with ID starting with 29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60 not found: ID does not exist" containerID="29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.375743 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60"} err="failed to get container status 
\"29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60\": rpc error: code = NotFound desc = could not find container \"29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60\": container with ID starting with 29e123d4131de966e4cef491ba1a24dfa3c5b3b99c0e33dcc2d8b8e5abeb9c60 not found: ID does not exist" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.375771 4990 scope.go:117] "RemoveContainer" containerID="dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8" Oct 03 11:36:57 crc kubenswrapper[4990]: E1003 11:36:57.376090 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8\": container with ID starting with dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8 not found: ID does not exist" containerID="dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8" Oct 03 11:36:57 crc kubenswrapper[4990]: I1003 11:36:57.376105 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8"} err="failed to get container status \"dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8\": rpc error: code = NotFound desc = could not find container \"dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8\": container with ID starting with dee11c741ede20186d586e2b1f24625633d44d75ed9625fbc6f302a52a285ff8 not found: ID does not exist" Oct 03 11:36:58 crc kubenswrapper[4990]: I1003 11:36:58.521017 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:58 crc kubenswrapper[4990]: I1003 11:36:58.521315 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:58 crc kubenswrapper[4990]: I1003 11:36:58.575449 
4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:36:58 crc kubenswrapper[4990]: I1003 11:36:58.892980 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" path="/var/lib/kubelet/pods/d13713c9-9230-46ad-8b93-5ea52833d53e/volumes" Oct 03 11:36:59 crc kubenswrapper[4990]: I1003 11:36:59.354096 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.382011 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.382792 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-flgr6" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="registry-server" containerID="cri-o://b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3" gracePeriod=2 Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.887625 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.940116 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content\") pod \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.940334 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities\") pod \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.941273 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities" (OuterVolumeSpecName: "utilities") pod "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" (UID: "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.941415 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp674\" (UniqueName: \"kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674\") pod \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\" (UID: \"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8\") " Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.943189 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:37:01 crc kubenswrapper[4990]: I1003 11:37:01.953554 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674" (OuterVolumeSpecName: "kube-api-access-dp674") pod "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" (UID: "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8"). InnerVolumeSpecName "kube-api-access-dp674". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.046301 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp674\" (UniqueName: \"kubernetes.io/projected/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-kube-api-access-dp674\") on node \"crc\" DevicePath \"\"" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.074116 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" (UID: "1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.148899 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.319846 4990 generic.go:334] "Generic (PLEG): container finished" podID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerID="b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3" exitCode=0 Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.319923 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-flgr6" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.319945 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerDied","Data":"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3"} Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.320303 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-flgr6" event={"ID":"1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8","Type":"ContainerDied","Data":"9af5ad037dfa1528d20701a562f37aef71c9fa4a5583795d2552b3352425ba65"} Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.320324 4990 scope.go:117] "RemoveContainer" containerID="b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.362016 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.371073 4990 scope.go:117] "RemoveContainer" containerID="3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 
11:37:02.376311 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-flgr6"] Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.415864 4990 scope.go:117] "RemoveContainer" containerID="48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.489628 4990 scope.go:117] "RemoveContainer" containerID="b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3" Oct 03 11:37:02 crc kubenswrapper[4990]: E1003 11:37:02.490051 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3\": container with ID starting with b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3 not found: ID does not exist" containerID="b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.490078 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3"} err="failed to get container status \"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3\": rpc error: code = NotFound desc = could not find container \"b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3\": container with ID starting with b5df12a1d08d33c58360bb44da5146c606f6dbd1131c5cacbfb9eca8919bfba3 not found: ID does not exist" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.490095 4990 scope.go:117] "RemoveContainer" containerID="3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87" Oct 03 11:37:02 crc kubenswrapper[4990]: E1003 11:37:02.492035 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87\": container with ID 
starting with 3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87 not found: ID does not exist" containerID="3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.492085 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87"} err="failed to get container status \"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87\": rpc error: code = NotFound desc = could not find container \"3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87\": container with ID starting with 3341fa070bffbc3d28b50634db274a74fac122c58684f6374e755b2670abda87 not found: ID does not exist" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.492113 4990 scope.go:117] "RemoveContainer" containerID="48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c" Oct 03 11:37:02 crc kubenswrapper[4990]: E1003 11:37:02.492750 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c\": container with ID starting with 48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c not found: ID does not exist" containerID="48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.492785 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c"} err="failed to get container status \"48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c\": rpc error: code = NotFound desc = could not find container \"48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c\": container with ID starting with 48db0b2b70106a5d6481bc0b2fd9dc8df34ad93f891f21a163bcbb27d423b46c not found: 
ID does not exist" Oct 03 11:37:02 crc kubenswrapper[4990]: I1003 11:37:02.895433 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" path="/var/lib/kubelet/pods/1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8/volumes" Oct 03 11:37:55 crc kubenswrapper[4990]: I1003 11:37:55.303575 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:37:55 crc kubenswrapper[4990]: I1003 11:37:55.304667 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.387037 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388187 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388205 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388230 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388237 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="registry-server" Oct 03 11:38:04 
crc kubenswrapper[4990]: E1003 11:38:04.388253 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="extract-content" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388260 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="extract-content" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388283 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388291 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388308 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388315 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388330 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="extract-content" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388337 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="extract-content" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388354 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="extract-content" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388361 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="extract-content" Oct 03 11:38:04 crc 
kubenswrapper[4990]: E1003 11:38:04.388373 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388380 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: E1003 11:38:04.388417 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388425 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="extract-utilities" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388670 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9f12c0-f792-4afe-8c1f-9b61b251d59e" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388688 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca26d59-a2d4-4c91-a8f4-b5d89e7b1ac8" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.388708 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13713c9-9230-46ad-8b93-5ea52833d53e" containerName="registry-server" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.390722 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.403545 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.437614 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.437891 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h22t6\" (UniqueName: \"kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.437988 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.539813 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.540106 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h22t6\" (UniqueName: \"kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.540209 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.540726 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.541021 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.571478 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h22t6\" (UniqueName: \"kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6\") pod \"redhat-marketplace-tzqd8\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:04 crc kubenswrapper[4990]: I1003 11:38:04.722678 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:05 crc kubenswrapper[4990]: I1003 11:38:05.174043 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:06 crc kubenswrapper[4990]: I1003 11:38:06.018013 4990 generic.go:334] "Generic (PLEG): container finished" podID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerID="b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063" exitCode=0 Oct 03 11:38:06 crc kubenswrapper[4990]: I1003 11:38:06.018054 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerDied","Data":"b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063"} Oct 03 11:38:06 crc kubenswrapper[4990]: I1003 11:38:06.018278 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerStarted","Data":"bf9bef614d9383f3953c2290414c981ea3c17e31e362fef0dcad412a1bb1591b"} Oct 03 11:38:07 crc kubenswrapper[4990]: I1003 11:38:07.029826 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerStarted","Data":"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e"} Oct 03 11:38:08 crc kubenswrapper[4990]: I1003 11:38:08.042153 4990 generic.go:334] "Generic (PLEG): container finished" podID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerID="6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e" exitCode=0 Oct 03 11:38:08 crc kubenswrapper[4990]: I1003 11:38:08.042253 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" 
event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerDied","Data":"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e"} Oct 03 11:38:09 crc kubenswrapper[4990]: I1003 11:38:09.061347 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerStarted","Data":"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301"} Oct 03 11:38:09 crc kubenswrapper[4990]: I1003 11:38:09.102390 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzqd8" podStartSLOduration=2.487021684 podStartE2EDuration="5.102371125s" podCreationTimestamp="2025-10-03 11:38:04 +0000 UTC" firstStartedPulling="2025-10-03 11:38:06.019551172 +0000 UTC m=+6867.816183029" lastFinishedPulling="2025-10-03 11:38:08.634900613 +0000 UTC m=+6870.431532470" observedRunningTime="2025-10-03 11:38:09.089888813 +0000 UTC m=+6870.886520680" watchObservedRunningTime="2025-10-03 11:38:09.102371125 +0000 UTC m=+6870.899002982" Oct 03 11:38:14 crc kubenswrapper[4990]: I1003 11:38:14.723303 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:14 crc kubenswrapper[4990]: I1003 11:38:14.724813 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:14 crc kubenswrapper[4990]: I1003 11:38:14.774548 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:15 crc kubenswrapper[4990]: I1003 11:38:15.182229 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:15 crc kubenswrapper[4990]: I1003 11:38:15.248095 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.144550 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzqd8" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="registry-server" containerID="cri-o://2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301" gracePeriod=2 Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.766031 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.935265 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content\") pod \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.935655 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h22t6\" (UniqueName: \"kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6\") pod \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.935708 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities\") pod \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\" (UID: \"25d1a394-60c7-4e84-a15f-ff07a7e4ec46\") " Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.937045 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities" (OuterVolumeSpecName: "utilities") pod "25d1a394-60c7-4e84-a15f-ff07a7e4ec46" (UID: 
"25d1a394-60c7-4e84-a15f-ff07a7e4ec46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.948857 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6" (OuterVolumeSpecName: "kube-api-access-h22t6") pod "25d1a394-60c7-4e84-a15f-ff07a7e4ec46" (UID: "25d1a394-60c7-4e84-a15f-ff07a7e4ec46"). InnerVolumeSpecName "kube-api-access-h22t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:38:17 crc kubenswrapper[4990]: I1003 11:38:17.953679 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25d1a394-60c7-4e84-a15f-ff07a7e4ec46" (UID: "25d1a394-60c7-4e84-a15f-ff07a7e4ec46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.038707 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.039056 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h22t6\" (UniqueName: \"kubernetes.io/projected/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-kube-api-access-h22t6\") on node \"crc\" DevicePath \"\"" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.039228 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d1a394-60c7-4e84-a15f-ff07a7e4ec46-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.165222 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerID="2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301" exitCode=0 Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.165285 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerDied","Data":"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301"} Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.165320 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzqd8" event={"ID":"25d1a394-60c7-4e84-a15f-ff07a7e4ec46","Type":"ContainerDied","Data":"bf9bef614d9383f3953c2290414c981ea3c17e31e362fef0dcad412a1bb1591b"} Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.165358 4990 scope.go:117] "RemoveContainer" containerID="2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.165614 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzqd8" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.189916 4990 scope.go:117] "RemoveContainer" containerID="6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.211860 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.218799 4990 scope.go:117] "RemoveContainer" containerID="b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.225922 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzqd8"] Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.269921 4990 scope.go:117] "RemoveContainer" containerID="2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301" Oct 03 11:38:18 crc kubenswrapper[4990]: E1003 11:38:18.270421 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301\": container with ID starting with 2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301 not found: ID does not exist" containerID="2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.270557 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301"} err="failed to get container status \"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301\": rpc error: code = NotFound desc = could not find container \"2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301\": container with ID starting with 2ed7a0dacfb73f0f0156e92c80f3bb9e0dfa904308d810d671cec04e3ea6e301 not found: 
ID does not exist" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.270696 4990 scope.go:117] "RemoveContainer" containerID="6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e" Oct 03 11:38:18 crc kubenswrapper[4990]: E1003 11:38:18.271151 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e\": container with ID starting with 6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e not found: ID does not exist" containerID="6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.271246 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e"} err="failed to get container status \"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e\": rpc error: code = NotFound desc = could not find container \"6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e\": container with ID starting with 6b5f8affb6fd156bd736b054dd9143fd03a2f95dd9228239e7f89524bf86ea7e not found: ID does not exist" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.271321 4990 scope.go:117] "RemoveContainer" containerID="b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063" Oct 03 11:38:18 crc kubenswrapper[4990]: E1003 11:38:18.271726 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063\": container with ID starting with b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063 not found: ID does not exist" containerID="b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.271773 4990 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063"} err="failed to get container status \"b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063\": rpc error: code = NotFound desc = could not find container \"b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063\": container with ID starting with b5fd398c70c1f1e7f4193c5fea1c7a46f0a613d0688de84c3fbe6a10305fb063 not found: ID does not exist" Oct 03 11:38:18 crc kubenswrapper[4990]: I1003 11:38:18.885157 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" path="/var/lib/kubelet/pods/25d1a394-60c7-4e84-a15f-ff07a7e4ec46/volumes" Oct 03 11:38:25 crc kubenswrapper[4990]: I1003 11:38:25.303979 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:38:25 crc kubenswrapper[4990]: I1003 11:38:25.305029 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.303499 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.304389 4990 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.304438 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.305474 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.305543 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974" gracePeriod=600 Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.638660 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974" exitCode=0 Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.638752 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974"} Oct 03 11:38:55 crc kubenswrapper[4990]: I1003 11:38:55.639281 4990 scope.go:117] "RemoveContainer" 
containerID="c1dd72ae48413668e59c2d51f5705c595824c8c56437e1708877346ed21ed8d1" Oct 03 11:38:56 crc kubenswrapper[4990]: I1003 11:38:56.662129 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1"} Oct 03 11:39:25 crc kubenswrapper[4990]: I1003 11:39:25.045364 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-fkm95"] Oct 03 11:39:25 crc kubenswrapper[4990]: I1003 11:39:25.053719 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-fkm95"] Oct 03 11:39:26 crc kubenswrapper[4990]: I1003 11:39:26.891118 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a886fcb0-74f6-4fde-a081-7d56b2174761" path="/var/lib/kubelet/pods/a886fcb0-74f6-4fde-a081-7d56b2174761/volumes" Oct 03 11:39:35 crc kubenswrapper[4990]: I1003 11:39:35.049142 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cade-account-create-9ckfv"] Oct 03 11:39:35 crc kubenswrapper[4990]: I1003 11:39:35.065826 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cade-account-create-9ckfv"] Oct 03 11:39:36 crc kubenswrapper[4990]: I1003 11:39:36.887806 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313ded8f-3fe0-45db-8834-23a72733168f" path="/var/lib/kubelet/pods/313ded8f-3fe0-45db-8834-23a72733168f/volumes" Oct 03 11:39:49 crc kubenswrapper[4990]: I1003 11:39:49.054530 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-54f7b"] Oct 03 11:39:49 crc kubenswrapper[4990]: I1003 11:39:49.071692 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-54f7b"] Oct 03 11:39:50 crc kubenswrapper[4990]: I1003 11:39:50.885354 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bfc24b10-6ae7-478b-ad51-7339be03dbcb" path="/var/lib/kubelet/pods/bfc24b10-6ae7-478b-ad51-7339be03dbcb/volumes" Oct 03 11:39:53 crc kubenswrapper[4990]: I1003 11:39:53.151005 4990 scope.go:117] "RemoveContainer" containerID="2adde44c0b58fab7053781bea87807693933a6d9f757062e26d4a8d0245178e3" Oct 03 11:39:53 crc kubenswrapper[4990]: I1003 11:39:53.191397 4990 scope.go:117] "RemoveContainer" containerID="81e0bcb214bfda1a5f64b9257db50cd9bbee7a6e6d1388f76a0429709fc828fb" Oct 03 11:39:53 crc kubenswrapper[4990]: I1003 11:39:53.226321 4990 scope.go:117] "RemoveContainer" containerID="9ec57d983c322df18e215edc53c421a052eb2202debaa897a20ed35052f3883f" Oct 03 11:39:53 crc kubenswrapper[4990]: I1003 11:39:53.320142 4990 scope.go:117] "RemoveContainer" containerID="71d0ce6656858ea8aa19f8da768656c026cd07cebb8a5daa2fe32779ab262b15" Oct 03 11:39:53 crc kubenswrapper[4990]: I1003 11:39:53.368091 4990 scope.go:117] "RemoveContainer" containerID="c56359e95628c4d91cb2d160306679e7765c7be89ce2210660495a3bae0ec2d3" Oct 03 11:40:55 crc kubenswrapper[4990]: I1003 11:40:55.303916 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:40:55 crc kubenswrapper[4990]: I1003 11:40:55.304411 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:41:25 crc kubenswrapper[4990]: I1003 11:41:25.303629 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:41:25 crc kubenswrapper[4990]: I1003 11:41:25.304259 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.303946 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.304651 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.304713 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.305835 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.305929 4990 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" gracePeriod=600 Oct 03 11:41:55 crc kubenswrapper[4990]: E1003 11:41:55.447678 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.669865 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" exitCode=0 Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.670647 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1"} Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.670768 4990 scope.go:117] "RemoveContainer" containerID="a9eb9254fecc128c0d2924137bcb0bd2488e7d137e2ef5a06ce31e9204ba2974" Oct 03 11:41:55 crc kubenswrapper[4990]: I1003 11:41:55.673242 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:41:55 crc kubenswrapper[4990]: E1003 11:41:55.674767 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:42:08 crc kubenswrapper[4990]: I1003 11:42:08.886819 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:42:08 crc kubenswrapper[4990]: E1003 11:42:08.890339 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:42:09 crc kubenswrapper[4990]: I1003 11:42:09.058332 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-fxxg9"] Oct 03 11:42:09 crc kubenswrapper[4990]: I1003 11:42:09.084185 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-fxxg9"] Oct 03 11:42:10 crc kubenswrapper[4990]: I1003 11:42:10.888071 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e224c1a-fc5b-4920-b8f7-05343df90890" path="/var/lib/kubelet/pods/7e224c1a-fc5b-4920-b8f7-05343df90890/volumes" Oct 03 11:42:19 crc kubenswrapper[4990]: I1003 11:42:19.053702 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1f1f-account-create-kw9mj"] Oct 03 11:42:19 crc kubenswrapper[4990]: I1003 11:42:19.066475 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1f1f-account-create-kw9mj"] Oct 03 11:42:19 crc kubenswrapper[4990]: I1003 11:42:19.872125 4990 scope.go:117] "RemoveContainer" 
containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:42:19 crc kubenswrapper[4990]: E1003 11:42:19.872741 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:42:20 crc kubenswrapper[4990]: I1003 11:42:20.887989 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc4a764-22f0-4166-b68a-b4f45b864d3c" path="/var/lib/kubelet/pods/5fc4a764-22f0-4166-b68a-b4f45b864d3c/volumes" Oct 03 11:42:30 crc kubenswrapper[4990]: I1003 11:42:30.872150 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:42:30 crc kubenswrapper[4990]: E1003 11:42:30.873014 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:42:32 crc kubenswrapper[4990]: I1003 11:42:32.064396 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-5cdxf"] Oct 03 11:42:32 crc kubenswrapper[4990]: I1003 11:42:32.077107 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-5cdxf"] Oct 03 11:42:32 crc kubenswrapper[4990]: I1003 11:42:32.886285 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f453696-18c2-4d5d-a8e4-051dd710c1e4" 
path="/var/lib/kubelet/pods/4f453696-18c2-4d5d-a8e4-051dd710c1e4/volumes" Oct 03 11:42:43 crc kubenswrapper[4990]: I1003 11:42:43.873048 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:42:43 crc kubenswrapper[4990]: E1003 11:42:43.874437 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:42:53 crc kubenswrapper[4990]: I1003 11:42:53.588392 4990 scope.go:117] "RemoveContainer" containerID="d55095ecf406905075e0bc767c8cd3f4852e55bd72d1c1151bc6f4e9e947cd43" Oct 03 11:42:53 crc kubenswrapper[4990]: I1003 11:42:53.620751 4990 scope.go:117] "RemoveContainer" containerID="c822c5c834509db335b91e0ad6cb8eeda9b4e950669ffd04590371f056abe4c4" Oct 03 11:42:53 crc kubenswrapper[4990]: I1003 11:42:53.692593 4990 scope.go:117] "RemoveContainer" containerID="d258f363f562fdbfbb7cb9557de6e67b8d50ea78d3876f8741bb7412aa413c6e" Oct 03 11:42:55 crc kubenswrapper[4990]: I1003 11:42:55.872610 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:42:55 crc kubenswrapper[4990]: E1003 11:42:55.873658 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:43:08 crc kubenswrapper[4990]: 
I1003 11:43:08.879107 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:43:08 crc kubenswrapper[4990]: E1003 11:43:08.880046 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:43:21 crc kubenswrapper[4990]: I1003 11:43:21.873323 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:43:21 crc kubenswrapper[4990]: E1003 11:43:21.874757 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:43:35 crc kubenswrapper[4990]: I1003 11:43:35.872218 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:43:35 crc kubenswrapper[4990]: E1003 11:43:35.873164 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:43:47 crc 
kubenswrapper[4990]: I1003 11:43:47.872707 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:43:47 crc kubenswrapper[4990]: E1003 11:43:47.875348 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:44:00 crc kubenswrapper[4990]: I1003 11:44:00.873581 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:44:00 crc kubenswrapper[4990]: E1003 11:44:00.874602 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:44:13 crc kubenswrapper[4990]: I1003 11:44:13.872089 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:44:13 crc kubenswrapper[4990]: E1003 11:44:13.873330 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 
03 11:44:26 crc kubenswrapper[4990]: I1003 11:44:26.872779 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:44:26 crc kubenswrapper[4990]: E1003 11:44:26.873655 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:44:40 crc kubenswrapper[4990]: I1003 11:44:40.871890 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:44:40 crc kubenswrapper[4990]: E1003 11:44:40.872906 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:44:54 crc kubenswrapper[4990]: I1003 11:44:54.872036 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:44:54 crc kubenswrapper[4990]: E1003 11:44:54.872845 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.183885 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj"] Oct 03 11:45:00 crc kubenswrapper[4990]: E1003 11:45:00.187033 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="extract-content" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.187175 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="extract-content" Oct 03 11:45:00 crc kubenswrapper[4990]: E1003 11:45:00.187292 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="extract-utilities" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.187371 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="extract-utilities" Oct 03 11:45:00 crc kubenswrapper[4990]: E1003 11:45:00.187458 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="registry-server" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.187563 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="registry-server" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.187907 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d1a394-60c7-4e84-a15f-ff07a7e4ec46" containerName="registry-server" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.189075 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.192925 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.193771 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.202609 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj"] Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.238684 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrpb\" (UniqueName: \"kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.238850 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.238928 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.340629 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.340703 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.340871 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrpb\" (UniqueName: \"kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.341793 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.359764 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.364284 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrpb\" (UniqueName: \"kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb\") pod \"collect-profiles-29324865-h7tzj\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:00 crc kubenswrapper[4990]: I1003 11:45:00.513183 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:01 crc kubenswrapper[4990]: I1003 11:45:01.011696 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj"] Oct 03 11:45:01 crc kubenswrapper[4990]: I1003 11:45:01.709891 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" event={"ID":"c5b184dd-01eb-4959-a5ff-cebaf932f6fe","Type":"ContainerStarted","Data":"31ebcdb2e66616577868679543f8fedc107d3c95641655f36a795f60c9dc6c85"} Oct 03 11:45:01 crc kubenswrapper[4990]: I1003 11:45:01.710275 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" event={"ID":"c5b184dd-01eb-4959-a5ff-cebaf932f6fe","Type":"ContainerStarted","Data":"e08dc194b4d8a1eefb791374ac9016d8b0f93eaf73e86710c7d34b1ef95538bd"} Oct 03 11:45:01 crc kubenswrapper[4990]: I1003 11:45:01.748697 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" 
podStartSLOduration=1.748670577 podStartE2EDuration="1.748670577s" podCreationTimestamp="2025-10-03 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 11:45:01.733261149 +0000 UTC m=+7283.529893036" watchObservedRunningTime="2025-10-03 11:45:01.748670577 +0000 UTC m=+7283.545302444" Oct 03 11:45:02 crc kubenswrapper[4990]: I1003 11:45:02.722064 4990 generic.go:334] "Generic (PLEG): container finished" podID="c5b184dd-01eb-4959-a5ff-cebaf932f6fe" containerID="31ebcdb2e66616577868679543f8fedc107d3c95641655f36a795f60c9dc6c85" exitCode=0 Oct 03 11:45:02 crc kubenswrapper[4990]: I1003 11:45:02.722115 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" event={"ID":"c5b184dd-01eb-4959-a5ff-cebaf932f6fe","Type":"ContainerDied","Data":"31ebcdb2e66616577868679543f8fedc107d3c95641655f36a795f60c9dc6c85"} Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.105086 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.122575 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhrpb\" (UniqueName: \"kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb\") pod \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.122692 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume\") pod \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.122795 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume\") pod \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\" (UID: \"c5b184dd-01eb-4959-a5ff-cebaf932f6fe\") " Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.124412 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5b184dd-01eb-4959-a5ff-cebaf932f6fe" (UID: "c5b184dd-01eb-4959-a5ff-cebaf932f6fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.130838 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb" (OuterVolumeSpecName: "kube-api-access-lhrpb") pod "c5b184dd-01eb-4959-a5ff-cebaf932f6fe" (UID: "c5b184dd-01eb-4959-a5ff-cebaf932f6fe"). 
InnerVolumeSpecName "kube-api-access-lhrpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.133862 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5b184dd-01eb-4959-a5ff-cebaf932f6fe" (UID: "c5b184dd-01eb-4959-a5ff-cebaf932f6fe"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.224485 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.224572 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.224587 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhrpb\" (UniqueName: \"kubernetes.io/projected/c5b184dd-01eb-4959-a5ff-cebaf932f6fe-kube-api-access-lhrpb\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.741157 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" event={"ID":"c5b184dd-01eb-4959-a5ff-cebaf932f6fe","Type":"ContainerDied","Data":"e08dc194b4d8a1eefb791374ac9016d8b0f93eaf73e86710c7d34b1ef95538bd"} Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.741213 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e08dc194b4d8a1eefb791374ac9016d8b0f93eaf73e86710c7d34b1ef95538bd" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.741211 4990 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj" Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.825621 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5"] Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.838805 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324820-gffx5"] Oct 03 11:45:04 crc kubenswrapper[4990]: I1003 11:45:04.883080 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f70e51-844b-4117-84d9-ea9b8b10b65a" path="/var/lib/kubelet/pods/86f70e51-844b-4117-84d9-ea9b8b10b65a/volumes" Oct 03 11:45:05 crc kubenswrapper[4990]: I1003 11:45:05.871711 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:45:05 crc kubenswrapper[4990]: E1003 11:45:05.872389 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:45:17 crc kubenswrapper[4990]: I1003 11:45:17.871970 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:45:17 crc kubenswrapper[4990]: E1003 11:45:17.872872 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:45:28 crc kubenswrapper[4990]: I1003 11:45:28.881045 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:45:28 crc kubenswrapper[4990]: E1003 11:45:28.882095 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:45:38 crc kubenswrapper[4990]: I1003 11:45:38.084787 4990 generic.go:334] "Generic (PLEG): container finished" podID="2df99b4a-9b37-4525-923f-469c5d607ce9" containerID="a21ddaad28de374bf9a074d6992a9ca2072ec1d2bdf9b6a464821849b4f165fa" exitCode=0 Oct 03 11:45:38 crc kubenswrapper[4990]: I1003 11:45:38.084907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" event={"ID":"2df99b4a-9b37-4525-923f-469c5d607ce9","Type":"ContainerDied","Data":"a21ddaad28de374bf9a074d6992a9ca2072ec1d2bdf9b6a464821849b4f165fa"} Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.544912 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.613769 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt2fb\" (UniqueName: \"kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb\") pod \"2df99b4a-9b37-4525-923f-469c5d607ce9\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.613949 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key\") pod \"2df99b4a-9b37-4525-923f-469c5d607ce9\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.614076 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory\") pod \"2df99b4a-9b37-4525-923f-469c5d607ce9\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.614153 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle\") pod \"2df99b4a-9b37-4525-923f-469c5d607ce9\" (UID: \"2df99b4a-9b37-4525-923f-469c5d607ce9\") " Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.632824 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "2df99b4a-9b37-4525-923f-469c5d607ce9" (UID: "2df99b4a-9b37-4525-923f-469c5d607ce9"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.632932 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb" (OuterVolumeSpecName: "kube-api-access-rt2fb") pod "2df99b4a-9b37-4525-923f-469c5d607ce9" (UID: "2df99b4a-9b37-4525-923f-469c5d607ce9"). InnerVolumeSpecName "kube-api-access-rt2fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.661202 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory" (OuterVolumeSpecName: "inventory") pod "2df99b4a-9b37-4525-923f-469c5d607ce9" (UID: "2df99b4a-9b37-4525-923f-469c5d607ce9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.662039 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2df99b4a-9b37-4525-923f-469c5d607ce9" (UID: "2df99b4a-9b37-4525-923f-469c5d607ce9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.719205 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt2fb\" (UniqueName: \"kubernetes.io/projected/2df99b4a-9b37-4525-923f-469c5d607ce9-kube-api-access-rt2fb\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.719244 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.719257 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:39 crc kubenswrapper[4990]: I1003 11:45:39.719271 4990 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df99b4a-9b37-4525-923f-469c5d607ce9-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:45:40 crc kubenswrapper[4990]: I1003 11:45:40.111305 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" event={"ID":"2df99b4a-9b37-4525-923f-469c5d607ce9","Type":"ContainerDied","Data":"09387f3bab97f1f220cef58ee8799e1515490317fa707c422e26ec7eaa1451d6"} Oct 03 11:45:40 crc kubenswrapper[4990]: I1003 11:45:40.111358 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09387f3bab97f1f220cef58ee8799e1515490317fa707c422e26ec7eaa1451d6" Oct 03 11:45:40 crc kubenswrapper[4990]: I1003 11:45:40.111407 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg" Oct 03 11:45:43 crc kubenswrapper[4990]: I1003 11:45:43.872434 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:45:43 crc kubenswrapper[4990]: E1003 11:45:43.873336 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.674344 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-9nksq"] Oct 03 11:45:49 crc kubenswrapper[4990]: E1003 11:45:49.676571 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df99b4a-9b37-4525-923f-469c5d607ce9" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.676592 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df99b4a-9b37-4525-923f-469c5d607ce9" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 11:45:49 crc kubenswrapper[4990]: E1003 11:45:49.676825 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b184dd-01eb-4959-a5ff-cebaf932f6fe" containerName="collect-profiles" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.676832 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b184dd-01eb-4959-a5ff-cebaf932f6fe" containerName="collect-profiles" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.677080 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df99b4a-9b37-4525-923f-469c5d607ce9" 
containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.677106 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b184dd-01eb-4959-a5ff-cebaf932f6fe" containerName="collect-profiles" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.678043 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.683074 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.683149 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.683255 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.683092 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.684239 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-9nksq"] Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.836801 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.837165 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.837349 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.837656 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5rq\" (UniqueName: \"kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.939947 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5rq\" (UniqueName: \"kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.940104 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 
11:45:49.940154 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.940298 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.945804 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.945837 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.948139 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.964115 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5rq\" (UniqueName: \"kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq\") pod \"bootstrap-openstack-openstack-cell1-9nksq\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:49 crc kubenswrapper[4990]: I1003 11:45:49.999449 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:45:50 crc kubenswrapper[4990]: I1003 11:45:50.545258 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-9nksq"] Oct 03 11:45:50 crc kubenswrapper[4990]: I1003 11:45:50.571733 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:45:51 crc kubenswrapper[4990]: I1003 11:45:51.233325 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" event={"ID":"ddc5184e-8071-42ec-85d3-2bf5eba352ab","Type":"ContainerStarted","Data":"12b3ba3b7797e73daf965cd419106001eac93962343a7c63951a134abee9a59c"} Oct 03 11:45:52 crc kubenswrapper[4990]: I1003 11:45:52.256543 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" event={"ID":"ddc5184e-8071-42ec-85d3-2bf5eba352ab","Type":"ContainerStarted","Data":"b36c3bf2537d217d066cb5640d749dd9deaf75d47a1d1e8671aaa2862ea6742d"} Oct 03 11:45:52 crc kubenswrapper[4990]: I1003 11:45:52.283628 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" podStartSLOduration=2.80679267 podStartE2EDuration="3.283602124s" podCreationTimestamp="2025-10-03 11:45:49 +0000 UTC" 
firstStartedPulling="2025-10-03 11:45:50.57137719 +0000 UTC m=+7332.368009047" lastFinishedPulling="2025-10-03 11:45:51.048186644 +0000 UTC m=+7332.844818501" observedRunningTime="2025-10-03 11:45:52.278949454 +0000 UTC m=+7334.075581331" watchObservedRunningTime="2025-10-03 11:45:52.283602124 +0000 UTC m=+7334.080234001" Oct 03 11:45:53 crc kubenswrapper[4990]: I1003 11:45:53.848492 4990 scope.go:117] "RemoveContainer" containerID="b14ed541fe3d91784c6d616c02bcb786419793b57c7553c87ce6de5b5068282c" Oct 03 11:45:55 crc kubenswrapper[4990]: I1003 11:45:55.871560 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:45:55 crc kubenswrapper[4990]: E1003 11:45:55.872085 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:46:07 crc kubenswrapper[4990]: I1003 11:46:07.871882 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:46:07 crc kubenswrapper[4990]: E1003 11:46:07.872909 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:46:19 crc kubenswrapper[4990]: I1003 11:46:19.873654 4990 scope.go:117] "RemoveContainer" 
containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:46:19 crc kubenswrapper[4990]: E1003 11:46:19.874568 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:46:33 crc kubenswrapper[4990]: I1003 11:46:33.872793 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:46:33 crc kubenswrapper[4990]: E1003 11:46:33.875374 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.443874 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.451081 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.453684 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.565596 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.565691 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxlk\" (UniqueName: \"kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.565914 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.667724 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.667866 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.667912 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxlk\" (UniqueName: \"kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.668430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.668430 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.686883 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxlk\" (UniqueName: \"kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk\") pod \"certified-operators-kwpmp\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:39 crc kubenswrapper[4990]: I1003 11:46:39.794027 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:40 crc kubenswrapper[4990]: I1003 11:46:40.367024 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:40 crc kubenswrapper[4990]: W1003 11:46:40.379996 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f2756c_9b72_4bbd_b597_53c5471d3249.slice/crio-752bc8af439d2dfcfa71dce79d4dfff74711cd24f987a51ecb45ddde8c353586 WatchSource:0}: Error finding container 752bc8af439d2dfcfa71dce79d4dfff74711cd24f987a51ecb45ddde8c353586: Status 404 returned error can't find the container with id 752bc8af439d2dfcfa71dce79d4dfff74711cd24f987a51ecb45ddde8c353586 Oct 03 11:46:40 crc kubenswrapper[4990]: I1003 11:46:40.727950 4990 generic.go:334] "Generic (PLEG): container finished" podID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerID="0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9" exitCode=0 Oct 03 11:46:40 crc kubenswrapper[4990]: I1003 11:46:40.728005 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerDied","Data":"0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9"} Oct 03 11:46:40 crc kubenswrapper[4990]: I1003 11:46:40.728070 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerStarted","Data":"752bc8af439d2dfcfa71dce79d4dfff74711cd24f987a51ecb45ddde8c353586"} Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.215713 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.219186 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.225981 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.323882 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgh9\" (UniqueName: \"kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.324182 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.324224 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.426751 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckgh9\" (UniqueName: \"kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.426824 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.426863 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.427400 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.427454 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.448016 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgh9\" (UniqueName: \"kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9\") pod \"community-operators-gx6gc\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.536497 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.748225 4990 generic.go:334] "Generic (PLEG): container finished" podID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerID="82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df" exitCode=0 Oct 03 11:46:42 crc kubenswrapper[4990]: I1003 11:46:42.748767 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerDied","Data":"82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df"} Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.079612 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.769945 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerStarted","Data":"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69"} Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.772696 4990 generic.go:334] "Generic (PLEG): container finished" podID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerID="dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0" exitCode=0 Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.772775 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerDied","Data":"dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0"} Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.772837 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" 
event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerStarted","Data":"a459a8bbf5c9e97ccfde959667a90779f33d05a583f09a9e12bdbf172581c40d"} Oct 03 11:46:43 crc kubenswrapper[4990]: I1003 11:46:43.798130 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kwpmp" podStartSLOduration=2.387990953 podStartE2EDuration="4.798108021s" podCreationTimestamp="2025-10-03 11:46:39 +0000 UTC" firstStartedPulling="2025-10-03 11:46:40.731163651 +0000 UTC m=+7382.527795508" lastFinishedPulling="2025-10-03 11:46:43.141280719 +0000 UTC m=+7384.937912576" observedRunningTime="2025-10-03 11:46:43.790455503 +0000 UTC m=+7385.587087370" watchObservedRunningTime="2025-10-03 11:46:43.798108021 +0000 UTC m=+7385.594739868" Oct 03 11:46:44 crc kubenswrapper[4990]: I1003 11:46:44.798451 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerStarted","Data":"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae"} Oct 03 11:46:44 crc kubenswrapper[4990]: I1003 11:46:44.873380 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:46:44 crc kubenswrapper[4990]: E1003 11:46:44.873649 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:46:45 crc kubenswrapper[4990]: I1003 11:46:45.810654 4990 generic.go:334] "Generic (PLEG): container finished" podID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" 
containerID="e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae" exitCode=0 Oct 03 11:46:45 crc kubenswrapper[4990]: I1003 11:46:45.810788 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerDied","Data":"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae"} Oct 03 11:46:46 crc kubenswrapper[4990]: I1003 11:46:46.826805 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerStarted","Data":"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a"} Oct 03 11:46:46 crc kubenswrapper[4990]: I1003 11:46:46.851996 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gx6gc" podStartSLOduration=2.315102074 podStartE2EDuration="4.851971582s" podCreationTimestamp="2025-10-03 11:46:42 +0000 UTC" firstStartedPulling="2025-10-03 11:46:43.774384427 +0000 UTC m=+7385.571016294" lastFinishedPulling="2025-10-03 11:46:46.311253945 +0000 UTC m=+7388.107885802" observedRunningTime="2025-10-03 11:46:46.845107935 +0000 UTC m=+7388.641739802" watchObservedRunningTime="2025-10-03 11:46:46.851971582 +0000 UTC m=+7388.648603449" Oct 03 11:46:49 crc kubenswrapper[4990]: I1003 11:46:49.794612 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:49 crc kubenswrapper[4990]: I1003 11:46:49.795259 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:49 crc kubenswrapper[4990]: I1003 11:46:49.867112 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:49 crc kubenswrapper[4990]: I1003 
11:46:49.922313 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:51 crc kubenswrapper[4990]: I1003 11:46:51.001823 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:51 crc kubenswrapper[4990]: I1003 11:46:51.875750 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kwpmp" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="registry-server" containerID="cri-o://33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69" gracePeriod=2 Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.386885 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.468661 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content\") pod \"22f2756c-9b72-4bbd-b597-53c5471d3249\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.468889 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vxlk\" (UniqueName: \"kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk\") pod \"22f2756c-9b72-4bbd-b597-53c5471d3249\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.468978 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities\") pod \"22f2756c-9b72-4bbd-b597-53c5471d3249\" (UID: \"22f2756c-9b72-4bbd-b597-53c5471d3249\") " Oct 03 11:46:52 crc kubenswrapper[4990]: 
I1003 11:46:52.470826 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities" (OuterVolumeSpecName: "utilities") pod "22f2756c-9b72-4bbd-b597-53c5471d3249" (UID: "22f2756c-9b72-4bbd-b597-53c5471d3249"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.471351 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.483719 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk" (OuterVolumeSpecName: "kube-api-access-9vxlk") pod "22f2756c-9b72-4bbd-b597-53c5471d3249" (UID: "22f2756c-9b72-4bbd-b597-53c5471d3249"). InnerVolumeSpecName "kube-api-access-9vxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.532280 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22f2756c-9b72-4bbd-b597-53c5471d3249" (UID: "22f2756c-9b72-4bbd-b597-53c5471d3249"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.536634 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.537430 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.573202 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f2756c-9b72-4bbd-b597-53c5471d3249-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.573245 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vxlk\" (UniqueName: \"kubernetes.io/projected/22f2756c-9b72-4bbd-b597-53c5471d3249-kube-api-access-9vxlk\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.602969 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.905785 4990 generic.go:334] "Generic (PLEG): container finished" podID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerID="33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69" exitCode=0 Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.905858 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwpmp" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.905850 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerDied","Data":"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69"} Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.905930 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwpmp" event={"ID":"22f2756c-9b72-4bbd-b597-53c5471d3249","Type":"ContainerDied","Data":"752bc8af439d2dfcfa71dce79d4dfff74711cd24f987a51ecb45ddde8c353586"} Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.905955 4990 scope.go:117] "RemoveContainer" containerID="33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.938189 4990 scope.go:117] "RemoveContainer" containerID="82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.948421 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.957405 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kwpmp"] Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.968545 4990 scope.go:117] "RemoveContainer" containerID="0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9" Oct 03 11:46:52 crc kubenswrapper[4990]: I1003 11:46:52.979980 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.030211 4990 scope.go:117] "RemoveContainer" containerID="33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69" Oct 03 11:46:53 crc 
kubenswrapper[4990]: E1003 11:46:53.030915 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69\": container with ID starting with 33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69 not found: ID does not exist" containerID="33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.030965 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69"} err="failed to get container status \"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69\": rpc error: code = NotFound desc = could not find container \"33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69\": container with ID starting with 33fc8326eec363257cb13e3adf59c34baf2100339c77f2290e69e3f8c604fa69 not found: ID does not exist" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.031009 4990 scope.go:117] "RemoveContainer" containerID="82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df" Oct 03 11:46:53 crc kubenswrapper[4990]: E1003 11:46:53.032170 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df\": container with ID starting with 82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df not found: ID does not exist" containerID="82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.032208 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df"} err="failed to get container status 
\"82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df\": rpc error: code = NotFound desc = could not find container \"82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df\": container with ID starting with 82e6112661b3985218554cc2b7aad4565a709cfc15c9e925edfd6b47433176df not found: ID does not exist" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.032234 4990 scope.go:117] "RemoveContainer" containerID="0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9" Oct 03 11:46:53 crc kubenswrapper[4990]: E1003 11:46:53.032666 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9\": container with ID starting with 0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9 not found: ID does not exist" containerID="0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9" Oct 03 11:46:53 crc kubenswrapper[4990]: I1003 11:46:53.032709 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9"} err="failed to get container status \"0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9\": rpc error: code = NotFound desc = could not find container \"0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9\": container with ID starting with 0630c187bb6797e822ee2f5174599fcf27715bc7ce16ebd69894f4cd287d37d9 not found: ID does not exist" Oct 03 11:46:54 crc kubenswrapper[4990]: I1003 11:46:54.887180 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" path="/var/lib/kubelet/pods/22f2756c-9b72-4bbd-b597-53c5471d3249/volumes" Oct 03 11:46:54 crc kubenswrapper[4990]: I1003 11:46:54.998869 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 
11:46:54 crc kubenswrapper[4990]: I1003 11:46:54.999096 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gx6gc" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="registry-server" containerID="cri-o://a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a" gracePeriod=2 Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.507840 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.545841 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content\") pod \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.546029 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities\") pod \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.546168 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckgh9\" (UniqueName: \"kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9\") pod \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\" (UID: \"30fd2185-b1eb-4797-8316-dbfc4ddddf0b\") " Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.550614 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities" (OuterVolumeSpecName: "utilities") pod "30fd2185-b1eb-4797-8316-dbfc4ddddf0b" (UID: "30fd2185-b1eb-4797-8316-dbfc4ddddf0b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.563544 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9" (OuterVolumeSpecName: "kube-api-access-ckgh9") pod "30fd2185-b1eb-4797-8316-dbfc4ddddf0b" (UID: "30fd2185-b1eb-4797-8316-dbfc4ddddf0b"). InnerVolumeSpecName "kube-api-access-ckgh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.621276 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30fd2185-b1eb-4797-8316-dbfc4ddddf0b" (UID: "30fd2185-b1eb-4797-8316-dbfc4ddddf0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.649053 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.649085 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckgh9\" (UniqueName: \"kubernetes.io/projected/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-kube-api-access-ckgh9\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:55 crc kubenswrapper[4990]: I1003 11:46:55.649095 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30fd2185-b1eb-4797-8316-dbfc4ddddf0b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.018479 4990 generic.go:334] "Generic (PLEG): container finished" podID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" 
containerID="a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a" exitCode=0 Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.018752 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gx6gc" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.018781 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerDied","Data":"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a"} Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.022678 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gx6gc" event={"ID":"30fd2185-b1eb-4797-8316-dbfc4ddddf0b","Type":"ContainerDied","Data":"a459a8bbf5c9e97ccfde959667a90779f33d05a583f09a9e12bdbf172581c40d"} Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.022721 4990 scope.go:117] "RemoveContainer" containerID="a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.071058 4990 scope.go:117] "RemoveContainer" containerID="e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.106747 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.160202 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gx6gc"] Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.204732 4990 scope.go:117] "RemoveContainer" containerID="dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.253442 4990 scope.go:117] "RemoveContainer" containerID="a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a" Oct 03 
11:46:56 crc kubenswrapper[4990]: E1003 11:46:56.253929 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a\": container with ID starting with a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a not found: ID does not exist" containerID="a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.253960 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a"} err="failed to get container status \"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a\": rpc error: code = NotFound desc = could not find container \"a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a\": container with ID starting with a098d06933b0ef9e9b551be9e3d16c9f8d19bc358dacc9a97829494fd6c3530a not found: ID does not exist" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.253979 4990 scope.go:117] "RemoveContainer" containerID="e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae" Oct 03 11:46:56 crc kubenswrapper[4990]: E1003 11:46:56.254498 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae\": container with ID starting with e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae not found: ID does not exist" containerID="e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.254536 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae"} err="failed to get container status 
\"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae\": rpc error: code = NotFound desc = could not find container \"e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae\": container with ID starting with e56e66ed93eaf061a6095d6e4cbbd1814b28ea7706b584b2d4c50f9733cd24ae not found: ID does not exist" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.254552 4990 scope.go:117] "RemoveContainer" containerID="dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0" Oct 03 11:46:56 crc kubenswrapper[4990]: E1003 11:46:56.254827 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0\": container with ID starting with dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0 not found: ID does not exist" containerID="dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.254847 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0"} err="failed to get container status \"dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0\": rpc error: code = NotFound desc = could not find container \"dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0\": container with ID starting with dc5078a8e2e5bc6a7c5ab9c908a479ee5228eee82b0a7ce1b2ce8f9c6273d8e0 not found: ID does not exist" Oct 03 11:46:56 crc kubenswrapper[4990]: I1003 11:46:56.885830 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" path="/var/lib/kubelet/pods/30fd2185-b1eb-4797-8316-dbfc4ddddf0b/volumes" Oct 03 11:46:59 crc kubenswrapper[4990]: I1003 11:46:59.871951 4990 scope.go:117] "RemoveContainer" containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 
11:47:01 crc kubenswrapper[4990]: I1003 11:47:01.087784 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d"} Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.777014 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778056 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="extract-content" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778070 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="extract-content" Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778083 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778089 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778102 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="extract-utilities" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778108 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="extract-utilities" Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778126 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="extract-content" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778173 4990 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="extract-content" Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778188 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778194 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: E1003 11:48:50.778220 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="extract-utilities" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778226 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="extract-utilities" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778403 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fd2185-b1eb-4797-8316-dbfc4ddddf0b" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.778415 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f2756c-9b72-4bbd-b597-53c5471d3249" containerName="registry-server" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.780073 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.805016 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.846690 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtdf\" (UniqueName: \"kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.846856 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.846894 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.947996 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.948057 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.948168 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtdf\" (UniqueName: \"kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.948734 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.949484 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:50 crc kubenswrapper[4990]: I1003 11:48:50.978062 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtdf\" (UniqueName: \"kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf\") pod \"redhat-operators-fb8xv\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:51 crc kubenswrapper[4990]: I1003 11:48:51.120169 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:48:51 crc kubenswrapper[4990]: I1003 11:48:51.647592 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:48:51 crc kubenswrapper[4990]: W1003 11:48:51.653699 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56bace2_e3b8_41c3_8296_d21a4ae58be8.slice/crio-ffcf8a7424760775e426c5a5fc1a8766b03e54ebf2865f530afce6b5968a20fc WatchSource:0}: Error finding container ffcf8a7424760775e426c5a5fc1a8766b03e54ebf2865f530afce6b5968a20fc: Status 404 returned error can't find the container with id ffcf8a7424760775e426c5a5fc1a8766b03e54ebf2865f530afce6b5968a20fc Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.165068 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.169017 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.176210 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.277834 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjbln\" (UniqueName: \"kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.277917 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.278151 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.316197 4990 generic.go:334] "Generic (PLEG): container finished" podID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerID="73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655" exitCode=0 Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.316249 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" 
event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerDied","Data":"73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655"} Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.316278 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerStarted","Data":"ffcf8a7424760775e426c5a5fc1a8766b03e54ebf2865f530afce6b5968a20fc"} Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.379909 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjbln\" (UniqueName: \"kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.379956 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.380044 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.383110 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " 
pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.383182 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.402147 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjbln\" (UniqueName: \"kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln\") pod \"redhat-marketplace-59zxh\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:52 crc kubenswrapper[4990]: I1003 11:48:52.497918 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:48:53 crc kubenswrapper[4990]: I1003 11:48:53.021722 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:48:53 crc kubenswrapper[4990]: W1003 11:48:53.026337 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82eb5b0d_37f0_4d07_9697_d097551c5a13.slice/crio-008e46641a00c5069c33d4ab14d167e23963f448505b8aee7566aa35864dc636 WatchSource:0}: Error finding container 008e46641a00c5069c33d4ab14d167e23963f448505b8aee7566aa35864dc636: Status 404 returned error can't find the container with id 008e46641a00c5069c33d4ab14d167e23963f448505b8aee7566aa35864dc636 Oct 03 11:48:53 crc kubenswrapper[4990]: I1003 11:48:53.331322 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" 
event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerStarted","Data":"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49"} Oct 03 11:48:53 crc kubenswrapper[4990]: I1003 11:48:53.333795 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerDied","Data":"f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8"} Oct 03 11:48:53 crc kubenswrapper[4990]: I1003 11:48:53.333134 4990 generic.go:334] "Generic (PLEG): container finished" podID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerID="f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8" exitCode=0 Oct 03 11:48:53 crc kubenswrapper[4990]: I1003 11:48:53.334042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerStarted","Data":"008e46641a00c5069c33d4ab14d167e23963f448505b8aee7566aa35864dc636"} Oct 03 11:48:54 crc kubenswrapper[4990]: I1003 11:48:54.344887 4990 generic.go:334] "Generic (PLEG): container finished" podID="ddc5184e-8071-42ec-85d3-2bf5eba352ab" containerID="b36c3bf2537d217d066cb5640d749dd9deaf75d47a1d1e8671aaa2862ea6742d" exitCode=0 Oct 03 11:48:54 crc kubenswrapper[4990]: I1003 11:48:54.344968 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" event={"ID":"ddc5184e-8071-42ec-85d3-2bf5eba352ab","Type":"ContainerDied","Data":"b36c3bf2537d217d066cb5640d749dd9deaf75d47a1d1e8671aaa2862ea6742d"} Oct 03 11:48:54 crc kubenswrapper[4990]: I1003 11:48:54.347421 4990 generic.go:334] "Generic (PLEG): container finished" podID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerID="2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49" exitCode=0 Oct 03 11:48:54 crc kubenswrapper[4990]: I1003 11:48:54.347449 4990 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-fb8xv" event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerDied","Data":"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49"} Oct 03 11:48:55 crc kubenswrapper[4990]: I1003 11:48:55.359601 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerStarted","Data":"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc"} Oct 03 11:48:55 crc kubenswrapper[4990]: I1003 11:48:55.363580 4990 generic.go:334] "Generic (PLEG): container finished" podID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerID="17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2" exitCode=0 Oct 03 11:48:55 crc kubenswrapper[4990]: I1003 11:48:55.363719 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerDied","Data":"17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2"} Oct 03 11:48:55 crc kubenswrapper[4990]: I1003 11:48:55.397165 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fb8xv" podStartSLOduration=2.940962905 podStartE2EDuration="5.397144546s" podCreationTimestamp="2025-10-03 11:48:50 +0000 UTC" firstStartedPulling="2025-10-03 11:48:52.318478932 +0000 UTC m=+7514.115110809" lastFinishedPulling="2025-10-03 11:48:54.774660593 +0000 UTC m=+7516.571292450" observedRunningTime="2025-10-03 11:48:55.38799226 +0000 UTC m=+7517.184624117" watchObservedRunningTime="2025-10-03 11:48:55.397144546 +0000 UTC m=+7517.193776413" Oct 03 11:48:55 crc kubenswrapper[4990]: I1003 11:48:55.877579 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.062223 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5rq\" (UniqueName: \"kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq\") pod \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.062655 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key\") pod \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.062715 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory\") pod \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.062817 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle\") pod \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\" (UID: \"ddc5184e-8071-42ec-85d3-2bf5eba352ab\") " Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.068369 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq" (OuterVolumeSpecName: "kube-api-access-lc5rq") pod "ddc5184e-8071-42ec-85d3-2bf5eba352ab" (UID: "ddc5184e-8071-42ec-85d3-2bf5eba352ab"). InnerVolumeSpecName "kube-api-access-lc5rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.077833 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ddc5184e-8071-42ec-85d3-2bf5eba352ab" (UID: "ddc5184e-8071-42ec-85d3-2bf5eba352ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.099576 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory" (OuterVolumeSpecName: "inventory") pod "ddc5184e-8071-42ec-85d3-2bf5eba352ab" (UID: "ddc5184e-8071-42ec-85d3-2bf5eba352ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.100310 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ddc5184e-8071-42ec-85d3-2bf5eba352ab" (UID: "ddc5184e-8071-42ec-85d3-2bf5eba352ab"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.165263 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.165299 4990 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.165311 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5rq\" (UniqueName: \"kubernetes.io/projected/ddc5184e-8071-42ec-85d3-2bf5eba352ab-kube-api-access-lc5rq\") on node \"crc\" DevicePath \"\"" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.165321 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddc5184e-8071-42ec-85d3-2bf5eba352ab-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.377271 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" event={"ID":"ddc5184e-8071-42ec-85d3-2bf5eba352ab","Type":"ContainerDied","Data":"12b3ba3b7797e73daf965cd419106001eac93962343a7c63951a134abee9a59c"} Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.377328 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b3ba3b7797e73daf965cd419106001eac93962343a7c63951a134abee9a59c" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.377297 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-9nksq" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.381151 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerStarted","Data":"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d"} Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.419827 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-59zxh" podStartSLOduration=1.980050536 podStartE2EDuration="4.419805251s" podCreationTimestamp="2025-10-03 11:48:52 +0000 UTC" firstStartedPulling="2025-10-03 11:48:53.335310837 +0000 UTC m=+7515.131942694" lastFinishedPulling="2025-10-03 11:48:55.775065552 +0000 UTC m=+7517.571697409" observedRunningTime="2025-10-03 11:48:56.410186352 +0000 UTC m=+7518.206818209" watchObservedRunningTime="2025-10-03 11:48:56.419805251 +0000 UTC m=+7518.216437118" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.497876 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-zvplg"] Oct 03 11:48:56 crc kubenswrapper[4990]: E1003 11:48:56.498774 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc5184e-8071-42ec-85d3-2bf5eba352ab" containerName="bootstrap-openstack-openstack-cell1" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.498792 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc5184e-8071-42ec-85d3-2bf5eba352ab" containerName="bootstrap-openstack-openstack-cell1" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.499059 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc5184e-8071-42ec-85d3-2bf5eba352ab" containerName="bootstrap-openstack-openstack-cell1" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.500064 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.503002 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.503436 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.503600 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.503710 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.508749 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-zvplg"] Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.677766 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwgv9\" (UniqueName: \"kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.677966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.678023 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.780825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.780895 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.781026 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwgv9\" (UniqueName: \"kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.785019 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 
11:48:56.785962 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.798729 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwgv9\" (UniqueName: \"kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9\") pod \"download-cache-openstack-openstack-cell1-zvplg\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:56 crc kubenswrapper[4990]: I1003 11:48:56.833744 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:48:57 crc kubenswrapper[4990]: I1003 11:48:57.435045 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-zvplg"] Oct 03 11:48:57 crc kubenswrapper[4990]: W1003 11:48:57.441713 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d6d8d9_57b8_4256_b831_940e5798ab5c.slice/crio-2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc WatchSource:0}: Error finding container 2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc: Status 404 returned error can't find the container with id 2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc Oct 03 11:48:58 crc kubenswrapper[4990]: I1003 11:48:58.402846 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" 
event={"ID":"29d6d8d9-57b8-4256-b831-940e5798ab5c","Type":"ContainerStarted","Data":"6cd7123c1f03894f0ed5ea763324aa2d575541eb3a7db29be1988c5f1b2d5f47"} Oct 03 11:48:58 crc kubenswrapper[4990]: I1003 11:48:58.403377 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" event={"ID":"29d6d8d9-57b8-4256-b831-940e5798ab5c","Type":"ContainerStarted","Data":"2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc"} Oct 03 11:48:58 crc kubenswrapper[4990]: I1003 11:48:58.422916 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" podStartSLOduration=1.954134093 podStartE2EDuration="2.42289235s" podCreationTimestamp="2025-10-03 11:48:56 +0000 UTC" firstStartedPulling="2025-10-03 11:48:57.444830458 +0000 UTC m=+7519.241462325" lastFinishedPulling="2025-10-03 11:48:57.913588695 +0000 UTC m=+7519.710220582" observedRunningTime="2025-10-03 11:48:58.420774426 +0000 UTC m=+7520.217406293" watchObservedRunningTime="2025-10-03 11:48:58.42289235 +0000 UTC m=+7520.219524207" Oct 03 11:49:01 crc kubenswrapper[4990]: I1003 11:49:01.120892 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:01 crc kubenswrapper[4990]: I1003 11:49:01.121295 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:01 crc kubenswrapper[4990]: I1003 11:49:01.179439 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:01 crc kubenswrapper[4990]: I1003 11:49:01.514721 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:01 crc kubenswrapper[4990]: I1003 11:49:01.562912 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:49:02 crc kubenswrapper[4990]: I1003 11:49:02.498638 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:02 crc kubenswrapper[4990]: I1003 11:49:02.499033 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:02 crc kubenswrapper[4990]: I1003 11:49:02.595304 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:03 crc kubenswrapper[4990]: I1003 11:49:03.458813 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fb8xv" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="registry-server" containerID="cri-o://57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc" gracePeriod=2 Oct 03 11:49:03 crc kubenswrapper[4990]: I1003 11:49:03.509977 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:03 crc kubenswrapper[4990]: I1003 11:49:03.828611 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:49:03 crc kubenswrapper[4990]: I1003 11:49:03.952782 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.046137 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmtdf\" (UniqueName: \"kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf\") pod \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.046361 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content\") pod \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.046397 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities\") pod \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\" (UID: \"c56bace2-e3b8-41c3-8296-d21a4ae58be8\") " Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.047863 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities" (OuterVolumeSpecName: "utilities") pod "c56bace2-e3b8-41c3-8296-d21a4ae58be8" (UID: "c56bace2-e3b8-41c3-8296-d21a4ae58be8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.051772 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf" (OuterVolumeSpecName: "kube-api-access-gmtdf") pod "c56bace2-e3b8-41c3-8296-d21a4ae58be8" (UID: "c56bace2-e3b8-41c3-8296-d21a4ae58be8"). InnerVolumeSpecName "kube-api-access-gmtdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.148914 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.148939 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmtdf\" (UniqueName: \"kubernetes.io/projected/c56bace2-e3b8-41c3-8296-d21a4ae58be8-kube-api-access-gmtdf\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.281037 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c56bace2-e3b8-41c3-8296-d21a4ae58be8" (UID: "c56bace2-e3b8-41c3-8296-d21a4ae58be8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.354472 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c56bace2-e3b8-41c3-8296-d21a4ae58be8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.469670 4990 generic.go:334] "Generic (PLEG): container finished" podID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerID="57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc" exitCode=0 Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.469754 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerDied","Data":"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc"} Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.469786 4990 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fb8xv" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.469855 4990 scope.go:117] "RemoveContainer" containerID="57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.469837 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fb8xv" event={"ID":"c56bace2-e3b8-41c3-8296-d21a4ae58be8","Type":"ContainerDied","Data":"ffcf8a7424760775e426c5a5fc1a8766b03e54ebf2865f530afce6b5968a20fc"} Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.494130 4990 scope.go:117] "RemoveContainer" containerID="2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.519723 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.529484 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fb8xv"] Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.550134 4990 scope.go:117] "RemoveContainer" containerID="73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.574641 4990 scope.go:117] "RemoveContainer" containerID="57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc" Oct 03 11:49:04 crc kubenswrapper[4990]: E1003 11:49:04.575155 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc\": container with ID starting with 57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc not found: ID does not exist" containerID="57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.575218 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc"} err="failed to get container status \"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc\": rpc error: code = NotFound desc = could not find container \"57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc\": container with ID starting with 57c0c22258a15587c6947af633bd11a8a404515f394eea21dd46b48771a3d4cc not found: ID does not exist" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.575255 4990 scope.go:117] "RemoveContainer" containerID="2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49" Oct 03 11:49:04 crc kubenswrapper[4990]: E1003 11:49:04.575601 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49\": container with ID starting with 2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49 not found: ID does not exist" containerID="2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.575648 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49"} err="failed to get container status \"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49\": rpc error: code = NotFound desc = could not find container \"2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49\": container with ID starting with 2dcf0ef28d75c9909a72ef5c2555150c0a38c60b353875ed6aaaa0de863dff49 not found: ID does not exist" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.575674 4990 scope.go:117] "RemoveContainer" containerID="73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655" Oct 03 11:49:04 crc kubenswrapper[4990]: E1003 
11:49:04.575956 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655\": container with ID starting with 73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655 not found: ID does not exist" containerID="73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.576001 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655"} err="failed to get container status \"73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655\": rpc error: code = NotFound desc = could not find container \"73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655\": container with ID starting with 73b06707a104d70ee694118effb7a1de37d5e8709091c42f4c1e0b4fcc700655 not found: ID does not exist" Oct 03 11:49:04 crc kubenswrapper[4990]: I1003 11:49:04.885447 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" path="/var/lib/kubelet/pods/c56bace2-e3b8-41c3-8296-d21a4ae58be8/volumes" Oct 03 11:49:05 crc kubenswrapper[4990]: I1003 11:49:05.482707 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-59zxh" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="registry-server" containerID="cri-o://a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d" gracePeriod=2 Oct 03 11:49:05 crc kubenswrapper[4990]: I1003 11:49:05.951690 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.091325 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities\") pod \"82eb5b0d-37f0-4d07-9697-d097551c5a13\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.091430 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjbln\" (UniqueName: \"kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln\") pod \"82eb5b0d-37f0-4d07-9697-d097551c5a13\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.091608 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content\") pod \"82eb5b0d-37f0-4d07-9697-d097551c5a13\" (UID: \"82eb5b0d-37f0-4d07-9697-d097551c5a13\") " Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.092409 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities" (OuterVolumeSpecName: "utilities") pod "82eb5b0d-37f0-4d07-9697-d097551c5a13" (UID: "82eb5b0d-37f0-4d07-9697-d097551c5a13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.098861 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln" (OuterVolumeSpecName: "kube-api-access-kjbln") pod "82eb5b0d-37f0-4d07-9697-d097551c5a13" (UID: "82eb5b0d-37f0-4d07-9697-d097551c5a13"). InnerVolumeSpecName "kube-api-access-kjbln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.112144 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82eb5b0d-37f0-4d07-9697-d097551c5a13" (UID: "82eb5b0d-37f0-4d07-9697-d097551c5a13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.194401 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.194450 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82eb5b0d-37f0-4d07-9697-d097551c5a13-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.194464 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjbln\" (UniqueName: \"kubernetes.io/projected/82eb5b0d-37f0-4d07-9697-d097551c5a13-kube-api-access-kjbln\") on node \"crc\" DevicePath \"\"" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.497847 4990 generic.go:334] "Generic (PLEG): container finished" podID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerID="a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d" exitCode=0 Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.497924 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-59zxh" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.497951 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerDied","Data":"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d"} Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.498753 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-59zxh" event={"ID":"82eb5b0d-37f0-4d07-9697-d097551c5a13","Type":"ContainerDied","Data":"008e46641a00c5069c33d4ab14d167e23963f448505b8aee7566aa35864dc636"} Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.498805 4990 scope.go:117] "RemoveContainer" containerID="a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.526010 4990 scope.go:117] "RemoveContainer" containerID="17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.566618 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.582251 4990 scope.go:117] "RemoveContainer" containerID="f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.584704 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-59zxh"] Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.621805 4990 scope.go:117] "RemoveContainer" containerID="a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d" Oct 03 11:49:06 crc kubenswrapper[4990]: E1003 11:49:06.622630 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d\": container with ID starting with a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d not found: ID does not exist" containerID="a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.622720 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d"} err="failed to get container status \"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d\": rpc error: code = NotFound desc = could not find container \"a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d\": container with ID starting with a4b619f89cc4ab6228df52d3d451f231f89cc7bfd953af870911b59c8165493d not found: ID does not exist" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.622794 4990 scope.go:117] "RemoveContainer" containerID="17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2" Oct 03 11:49:06 crc kubenswrapper[4990]: E1003 11:49:06.623255 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2\": container with ID starting with 17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2 not found: ID does not exist" containerID="17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.623385 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2"} err="failed to get container status \"17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2\": rpc error: code = NotFound desc = could not find container \"17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2\": container with ID 
starting with 17981ee53542af1fb488cb771ead69bde07b787cd92e7cf953b62d0f218342b2 not found: ID does not exist" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.623484 4990 scope.go:117] "RemoveContainer" containerID="f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8" Oct 03 11:49:06 crc kubenswrapper[4990]: E1003 11:49:06.623914 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8\": container with ID starting with f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8 not found: ID does not exist" containerID="f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.623948 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8"} err="failed to get container status \"f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8\": rpc error: code = NotFound desc = could not find container \"f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8\": container with ID starting with f87f1970a2bff4fd86bd7aa775595a47c7aec3dba9e6227f28a5fc0f25af21f8 not found: ID does not exist" Oct 03 11:49:06 crc kubenswrapper[4990]: I1003 11:49:06.886885 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" path="/var/lib/kubelet/pods/82eb5b0d-37f0-4d07-9697-d097551c5a13/volumes" Oct 03 11:49:25 crc kubenswrapper[4990]: I1003 11:49:25.303901 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:49:25 crc kubenswrapper[4990]: I1003 
11:49:25.304601 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:49:55 crc kubenswrapper[4990]: I1003 11:49:55.304381 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:49:55 crc kubenswrapper[4990]: I1003 11:49:55.305226 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:50:25 crc kubenswrapper[4990]: I1003 11:50:25.303725 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:50:25 crc kubenswrapper[4990]: I1003 11:50:25.304525 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:50:25 crc kubenswrapper[4990]: I1003 11:50:25.304791 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:50:25 crc kubenswrapper[4990]: I1003 11:50:25.305905 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:50:25 crc kubenswrapper[4990]: I1003 11:50:25.305986 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d" gracePeriod=600 Oct 03 11:50:26 crc kubenswrapper[4990]: I1003 11:50:26.387622 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d" exitCode=0 Oct 03 11:50:26 crc kubenswrapper[4990]: I1003 11:50:26.387697 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d"} Oct 03 11:50:26 crc kubenswrapper[4990]: I1003 11:50:26.388225 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"} Oct 03 11:50:26 crc kubenswrapper[4990]: I1003 11:50:26.388267 4990 scope.go:117] "RemoveContainer" 
containerID="396c41c84114f465a89c0a6d3308f2cab34dec90e902fe8eeb78970a389ebba1" Oct 03 11:50:27 crc kubenswrapper[4990]: I1003 11:50:27.407107 4990 generic.go:334] "Generic (PLEG): container finished" podID="29d6d8d9-57b8-4256-b831-940e5798ab5c" containerID="6cd7123c1f03894f0ed5ea763324aa2d575541eb3a7db29be1988c5f1b2d5f47" exitCode=0 Oct 03 11:50:27 crc kubenswrapper[4990]: I1003 11:50:27.407220 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" event={"ID":"29d6d8d9-57b8-4256-b831-940e5798ab5c","Type":"ContainerDied","Data":"6cd7123c1f03894f0ed5ea763324aa2d575541eb3a7db29be1988c5f1b2d5f47"} Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.862579 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.932272 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwgv9\" (UniqueName: \"kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9\") pod \"29d6d8d9-57b8-4256-b831-940e5798ab5c\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.932574 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory\") pod \"29d6d8d9-57b8-4256-b831-940e5798ab5c\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.932817 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key\") pod \"29d6d8d9-57b8-4256-b831-940e5798ab5c\" (UID: \"29d6d8d9-57b8-4256-b831-940e5798ab5c\") " Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.940738 4990 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9" (OuterVolumeSpecName: "kube-api-access-bwgv9") pod "29d6d8d9-57b8-4256-b831-940e5798ab5c" (UID: "29d6d8d9-57b8-4256-b831-940e5798ab5c"). InnerVolumeSpecName "kube-api-access-bwgv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.977937 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29d6d8d9-57b8-4256-b831-940e5798ab5c" (UID: "29d6d8d9-57b8-4256-b831-940e5798ab5c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:50:28 crc kubenswrapper[4990]: I1003 11:50:28.979957 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory" (OuterVolumeSpecName: "inventory") pod "29d6d8d9-57b8-4256-b831-940e5798ab5c" (UID: "29d6d8d9-57b8-4256-b831-940e5798ab5c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.035622 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.035842 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwgv9\" (UniqueName: \"kubernetes.io/projected/29d6d8d9-57b8-4256-b831-940e5798ab5c-kube-api-access-bwgv9\") on node \"crc\" DevicePath \"\"" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.035919 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29d6d8d9-57b8-4256-b831-940e5798ab5c-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.435295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" event={"ID":"29d6d8d9-57b8-4256-b831-940e5798ab5c","Type":"ContainerDied","Data":"2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc"} Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.435644 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afc590401dda4aaf1d0760d38ece5d3197df2473086b4600765ecfd71f88afc" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.435390 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-zvplg" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511130 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fst44"] Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511556 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511572 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511599 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d6d8d9-57b8-4256-b831-940e5798ab5c" containerName="download-cache-openstack-openstack-cell1" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511606 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d6d8d9-57b8-4256-b831-940e5798ab5c" containerName="download-cache-openstack-openstack-cell1" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511618 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511625 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511639 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="extract-content" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511645 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="extract-content" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511660 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="extract-utilities" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511667 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="extract-utilities" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511679 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="extract-content" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511685 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="extract-content" Oct 03 11:50:29 crc kubenswrapper[4990]: E1003 11:50:29.511699 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="extract-utilities" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.511705 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="extract-utilities" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.512586 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d6d8d9-57b8-4256-b831-940e5798ab5c" containerName="download-cache-openstack-openstack-cell1" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.512627 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="82eb5b0d-37f0-4d07-9697-d097551c5a13" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.512643 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56bace2-e3b8-41c3-8296-d21a4ae58be8" containerName="registry-server" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.513649 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.516071 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.517294 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.530370 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.531458 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.546479 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fst44"] Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.649036 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.649207 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.649435 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-59j9m\" (UniqueName: \"kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.754198 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59j9m\" (UniqueName: \"kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.754316 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.754530 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.760116 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 
11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.760527 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.775156 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59j9m\" (UniqueName: \"kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m\") pod \"configure-network-openstack-openstack-cell1-fst44\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:29 crc kubenswrapper[4990]: I1003 11:50:29.846560 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:50:30 crc kubenswrapper[4990]: W1003 11:50:30.301336 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode826ef9d_06da_4c2c_a889_d7788e2ca694.slice/crio-73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c WatchSource:0}: Error finding container 73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c: Status 404 returned error can't find the container with id 73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c Oct 03 11:50:30 crc kubenswrapper[4990]: I1003 11:50:30.306549 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-fst44"] Oct 03 11:50:30 crc kubenswrapper[4990]: I1003 11:50:30.444714 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fst44" 
event={"ID":"e826ef9d-06da-4c2c-a889-d7788e2ca694","Type":"ContainerStarted","Data":"73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c"} Oct 03 11:50:31 crc kubenswrapper[4990]: I1003 11:50:31.455951 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fst44" event={"ID":"e826ef9d-06da-4c2c-a889-d7788e2ca694","Type":"ContainerStarted","Data":"c8fb07e06b906e6649acb9990bdb24c071c3bb86c6e80c8a7286ff185558802f"} Oct 03 11:50:31 crc kubenswrapper[4990]: I1003 11:50:31.485996 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-fst44" podStartSLOduration=2.048000054 podStartE2EDuration="2.485974674s" podCreationTimestamp="2025-10-03 11:50:29 +0000 UTC" firstStartedPulling="2025-10-03 11:50:30.304169731 +0000 UTC m=+7612.100801588" lastFinishedPulling="2025-10-03 11:50:30.742144351 +0000 UTC m=+7612.538776208" observedRunningTime="2025-10-03 11:50:31.474115157 +0000 UTC m=+7613.270747014" watchObservedRunningTime="2025-10-03 11:50:31.485974674 +0000 UTC m=+7613.282606531" Oct 03 11:51:48 crc kubenswrapper[4990]: I1003 11:51:48.271279 4990 generic.go:334] "Generic (PLEG): container finished" podID="e826ef9d-06da-4c2c-a889-d7788e2ca694" containerID="c8fb07e06b906e6649acb9990bdb24c071c3bb86c6e80c8a7286ff185558802f" exitCode=0 Oct 03 11:51:48 crc kubenswrapper[4990]: I1003 11:51:48.271355 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fst44" event={"ID":"e826ef9d-06da-4c2c-a889-d7788e2ca694","Type":"ContainerDied","Data":"c8fb07e06b906e6649acb9990bdb24c071c3bb86c6e80c8a7286ff185558802f"} Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.710682 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.743202 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59j9m\" (UniqueName: \"kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m\") pod \"e826ef9d-06da-4c2c-a889-d7788e2ca694\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.743251 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key\") pod \"e826ef9d-06da-4c2c-a889-d7788e2ca694\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.743311 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory\") pod \"e826ef9d-06da-4c2c-a889-d7788e2ca694\" (UID: \"e826ef9d-06da-4c2c-a889-d7788e2ca694\") " Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.751909 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m" (OuterVolumeSpecName: "kube-api-access-59j9m") pod "e826ef9d-06da-4c2c-a889-d7788e2ca694" (UID: "e826ef9d-06da-4c2c-a889-d7788e2ca694"). InnerVolumeSpecName "kube-api-access-59j9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.787337 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory" (OuterVolumeSpecName: "inventory") pod "e826ef9d-06da-4c2c-a889-d7788e2ca694" (UID: "e826ef9d-06da-4c2c-a889-d7788e2ca694"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.788172 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e826ef9d-06da-4c2c-a889-d7788e2ca694" (UID: "e826ef9d-06da-4c2c-a889-d7788e2ca694"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.846204 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59j9m\" (UniqueName: \"kubernetes.io/projected/e826ef9d-06da-4c2c-a889-d7788e2ca694-kube-api-access-59j9m\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.846240 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:49 crc kubenswrapper[4990]: I1003 11:51:49.846253 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e826ef9d-06da-4c2c-a889-d7788e2ca694-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.293304 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-fst44" event={"ID":"e826ef9d-06da-4c2c-a889-d7788e2ca694","Type":"ContainerDied","Data":"73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c"} Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.293833 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73865b485c9559e367ca08cb5f1704f91d5c007a226d40b916443b37014adb6c" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.293401 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-fst44" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.376633 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmcbb"] Oct 03 11:51:50 crc kubenswrapper[4990]: E1003 11:51:50.377060 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e826ef9d-06da-4c2c-a889-d7788e2ca694" containerName="configure-network-openstack-openstack-cell1" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.377080 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e826ef9d-06da-4c2c-a889-d7788e2ca694" containerName="configure-network-openstack-openstack-cell1" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.377307 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e826ef9d-06da-4c2c-a889-d7788e2ca694" containerName="configure-network-openstack-openstack-cell1" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.378106 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.380666 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.381010 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.381280 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.381580 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.393866 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmcbb"] Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.458999 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.459082 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.459701 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zvcr2\" (UniqueName: \"kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.563650 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvcr2\" (UniqueName: \"kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.563844 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.563944 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.570434 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc 
kubenswrapper[4990]: I1003 11:51:50.571712 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.586808 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcr2\" (UniqueName: \"kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2\") pod \"validate-network-openstack-openstack-cell1-gmcbb\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:50 crc kubenswrapper[4990]: I1003 11:51:50.698450 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:51 crc kubenswrapper[4990]: I1003 11:51:51.297826 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-gmcbb"] Oct 03 11:51:51 crc kubenswrapper[4990]: W1003 11:51:51.303492 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2503d26b_59e5_4e31_816f_89eb184fcb78.slice/crio-e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8 WatchSource:0}: Error finding container e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8: Status 404 returned error can't find the container with id e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8 Oct 03 11:51:51 crc kubenswrapper[4990]: I1003 11:51:51.310362 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 11:51:52 crc kubenswrapper[4990]: I1003 11:51:52.324645 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" event={"ID":"2503d26b-59e5-4e31-816f-89eb184fcb78","Type":"ContainerStarted","Data":"76845b2a61fef8d17317c0d0155f0fd7c1b09967c315a069f0a2f3c5f04b3e05"} Oct 03 11:51:52 crc kubenswrapper[4990]: I1003 11:51:52.325312 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" event={"ID":"2503d26b-59e5-4e31-816f-89eb184fcb78","Type":"ContainerStarted","Data":"e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8"} Oct 03 11:51:52 crc kubenswrapper[4990]: I1003 11:51:52.350685 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" podStartSLOduration=1.9783552599999998 podStartE2EDuration="2.350660411s" podCreationTimestamp="2025-10-03 11:51:50 +0000 UTC" firstStartedPulling="2025-10-03 11:51:51.310035801 +0000 UTC m=+7693.106667668" lastFinishedPulling="2025-10-03 11:51:51.682340972 +0000 UTC m=+7693.478972819" observedRunningTime="2025-10-03 11:51:52.345842187 +0000 UTC m=+7694.142474054" watchObservedRunningTime="2025-10-03 11:51:52.350660411 +0000 UTC m=+7694.147292288" Oct 03 11:51:57 crc kubenswrapper[4990]: I1003 11:51:57.377841 4990 generic.go:334] "Generic (PLEG): container finished" podID="2503d26b-59e5-4e31-816f-89eb184fcb78" containerID="76845b2a61fef8d17317c0d0155f0fd7c1b09967c315a069f0a2f3c5f04b3e05" exitCode=0 Oct 03 11:51:57 crc kubenswrapper[4990]: I1003 11:51:57.377921 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" event={"ID":"2503d26b-59e5-4e31-816f-89eb184fcb78","Type":"ContainerDied","Data":"76845b2a61fef8d17317c0d0155f0fd7c1b09967c315a069f0a2f3c5f04b3e05"} Oct 03 11:51:58 crc kubenswrapper[4990]: I1003 11:51:58.847489 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:58 crc kubenswrapper[4990]: I1003 11:51:58.970007 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvcr2\" (UniqueName: \"kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2\") pod \"2503d26b-59e5-4e31-816f-89eb184fcb78\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " Oct 03 11:51:58 crc kubenswrapper[4990]: I1003 11:51:58.970271 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory\") pod \"2503d26b-59e5-4e31-816f-89eb184fcb78\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " Oct 03 11:51:58 crc kubenswrapper[4990]: I1003 11:51:58.970397 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key\") pod \"2503d26b-59e5-4e31-816f-89eb184fcb78\" (UID: \"2503d26b-59e5-4e31-816f-89eb184fcb78\") " Oct 03 11:51:58 crc kubenswrapper[4990]: I1003 11:51:58.975649 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2" (OuterVolumeSpecName: "kube-api-access-zvcr2") pod "2503d26b-59e5-4e31-816f-89eb184fcb78" (UID: "2503d26b-59e5-4e31-816f-89eb184fcb78"). InnerVolumeSpecName "kube-api-access-zvcr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.001240 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory" (OuterVolumeSpecName: "inventory") pod "2503d26b-59e5-4e31-816f-89eb184fcb78" (UID: "2503d26b-59e5-4e31-816f-89eb184fcb78"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.002341 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2503d26b-59e5-4e31-816f-89eb184fcb78" (UID: "2503d26b-59e5-4e31-816f-89eb184fcb78"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.073120 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvcr2\" (UniqueName: \"kubernetes.io/projected/2503d26b-59e5-4e31-816f-89eb184fcb78-kube-api-access-zvcr2\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.073152 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.073161 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2503d26b-59e5-4e31-816f-89eb184fcb78-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.402377 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" event={"ID":"2503d26b-59e5-4e31-816f-89eb184fcb78","Type":"ContainerDied","Data":"e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8"} Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.402429 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4549658f173ca7897e112551087cdcbc066b9e3cce34a03f9006256f2c6b0a8" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.402461 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-gmcbb" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.484977 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kppbz"] Oct 03 11:51:59 crc kubenswrapper[4990]: E1003 11:51:59.485603 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2503d26b-59e5-4e31-816f-89eb184fcb78" containerName="validate-network-openstack-openstack-cell1" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.485633 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2503d26b-59e5-4e31-816f-89eb184fcb78" containerName="validate-network-openstack-openstack-cell1" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.485938 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2503d26b-59e5-4e31-816f-89eb184fcb78" containerName="validate-network-openstack-openstack-cell1" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.489070 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.491363 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.491772 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.491871 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.493790 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.494950 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kppbz"] Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.687824 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghfx\" (UniqueName: \"kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.688702 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.688948 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.791052 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.791244 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.791455 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mghfx\" (UniqueName: \"kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.796273 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.797681 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.809815 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghfx\" (UniqueName: \"kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx\") pod \"install-os-openstack-openstack-cell1-kppbz\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:51:59 crc kubenswrapper[4990]: I1003 11:51:59.813643 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:52:00 crc kubenswrapper[4990]: I1003 11:52:00.412108 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-kppbz"] Oct 03 11:52:01 crc kubenswrapper[4990]: I1003 11:52:01.425145 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kppbz" event={"ID":"204687ab-1f96-4f28-aa69-baaab8fcb8df","Type":"ContainerStarted","Data":"fe347c87460906061e82b8ddbac025009a2edb028c718c00ccaec777bd0a4610"} Oct 03 11:52:01 crc kubenswrapper[4990]: I1003 11:52:01.425840 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kppbz" event={"ID":"204687ab-1f96-4f28-aa69-baaab8fcb8df","Type":"ContainerStarted","Data":"1117540d86dd24046d058a27df52d8fb709a528c0392a2e24754e203b2fd7e99"} Oct 03 11:52:01 crc kubenswrapper[4990]: I1003 11:52:01.460386 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-kppbz" podStartSLOduration=2.028366378 
podStartE2EDuration="2.460361014s" podCreationTimestamp="2025-10-03 11:51:59 +0000 UTC" firstStartedPulling="2025-10-03 11:52:00.412694841 +0000 UTC m=+7702.209326738" lastFinishedPulling="2025-10-03 11:52:00.844689517 +0000 UTC m=+7702.641321374" observedRunningTime="2025-10-03 11:52:01.440312355 +0000 UTC m=+7703.236944212" watchObservedRunningTime="2025-10-03 11:52:01.460361014 +0000 UTC m=+7703.256992881" Oct 03 11:52:25 crc kubenswrapper[4990]: I1003 11:52:25.304701 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:52:25 crc kubenswrapper[4990]: I1003 11:52:25.305604 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:52:44 crc kubenswrapper[4990]: I1003 11:52:44.918381 4990 generic.go:334] "Generic (PLEG): container finished" podID="204687ab-1f96-4f28-aa69-baaab8fcb8df" containerID="fe347c87460906061e82b8ddbac025009a2edb028c718c00ccaec777bd0a4610" exitCode=0 Oct 03 11:52:44 crc kubenswrapper[4990]: I1003 11:52:44.918638 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kppbz" event={"ID":"204687ab-1f96-4f28-aa69-baaab8fcb8df","Type":"ContainerDied","Data":"fe347c87460906061e82b8ddbac025009a2edb028c718c00ccaec777bd0a4610"} Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.415569 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.540378 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key\") pod \"204687ab-1f96-4f28-aa69-baaab8fcb8df\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.540553 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory\") pod \"204687ab-1f96-4f28-aa69-baaab8fcb8df\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.540710 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mghfx\" (UniqueName: \"kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx\") pod \"204687ab-1f96-4f28-aa69-baaab8fcb8df\" (UID: \"204687ab-1f96-4f28-aa69-baaab8fcb8df\") " Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.545673 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx" (OuterVolumeSpecName: "kube-api-access-mghfx") pod "204687ab-1f96-4f28-aa69-baaab8fcb8df" (UID: "204687ab-1f96-4f28-aa69-baaab8fcb8df"). InnerVolumeSpecName "kube-api-access-mghfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.571854 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory" (OuterVolumeSpecName: "inventory") pod "204687ab-1f96-4f28-aa69-baaab8fcb8df" (UID: "204687ab-1f96-4f28-aa69-baaab8fcb8df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.573754 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "204687ab-1f96-4f28-aa69-baaab8fcb8df" (UID: "204687ab-1f96-4f28-aa69-baaab8fcb8df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.643228 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.643252 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/204687ab-1f96-4f28-aa69-baaab8fcb8df-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.643263 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mghfx\" (UniqueName: \"kubernetes.io/projected/204687ab-1f96-4f28-aa69-baaab8fcb8df-kube-api-access-mghfx\") on node \"crc\" DevicePath \"\"" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.950003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-kppbz" event={"ID":"204687ab-1f96-4f28-aa69-baaab8fcb8df","Type":"ContainerDied","Data":"1117540d86dd24046d058a27df52d8fb709a528c0392a2e24754e203b2fd7e99"} Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.950385 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1117540d86dd24046d058a27df52d8fb709a528c0392a2e24754e203b2fd7e99" Oct 03 11:52:46 crc kubenswrapper[4990]: I1003 11:52:46.950081 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-kppbz" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.057736 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-pf4zt"] Oct 03 11:52:47 crc kubenswrapper[4990]: E1003 11:52:47.058329 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204687ab-1f96-4f28-aa69-baaab8fcb8df" containerName="install-os-openstack-openstack-cell1" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.058352 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="204687ab-1f96-4f28-aa69-baaab8fcb8df" containerName="install-os-openstack-openstack-cell1" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.058685 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="204687ab-1f96-4f28-aa69-baaab8fcb8df" containerName="install-os-openstack-openstack-cell1" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.059695 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.063138 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.063306 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.063373 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.075015 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-pf4zt"] Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.077966 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.159938 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.160060 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqn2\" (UniqueName: \"kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.160095 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.262201 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.262344 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqn2\" (UniqueName: \"kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.262383 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.269634 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.270900 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.288810 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqn2\" (UniqueName: \"kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2\") pod \"configure-os-openstack-openstack-cell1-pf4zt\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:47 crc kubenswrapper[4990]: I1003 11:52:47.418224 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:52:48 crc kubenswrapper[4990]: I1003 11:52:48.032635 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-pf4zt"] Oct 03 11:52:48 crc kubenswrapper[4990]: I1003 11:52:48.978742 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" event={"ID":"0347a2f6-e939-4c8c-bc86-45622a71f3d4","Type":"ContainerStarted","Data":"49f7a41a65bce07ab477f681c448b88c87d83024065b44afa04290bffdd43f18"} Oct 03 11:52:48 crc kubenswrapper[4990]: I1003 11:52:48.979295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" event={"ID":"0347a2f6-e939-4c8c-bc86-45622a71f3d4","Type":"ContainerStarted","Data":"f71597644b783060722ae19c3121b9339171cb42d007558ceb686fa6762ad60e"} Oct 03 11:52:49 crc kubenswrapper[4990]: I1003 11:52:49.012130 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" 
podStartSLOduration=1.5908168900000001 podStartE2EDuration="2.012095027s" podCreationTimestamp="2025-10-03 11:52:47 +0000 UTC" firstStartedPulling="2025-10-03 11:52:48.047079844 +0000 UTC m=+7749.843711701" lastFinishedPulling="2025-10-03 11:52:48.468357981 +0000 UTC m=+7750.264989838" observedRunningTime="2025-10-03 11:52:49.000750834 +0000 UTC m=+7750.797382701" watchObservedRunningTime="2025-10-03 11:52:49.012095027 +0000 UTC m=+7750.808726884" Oct 03 11:52:55 crc kubenswrapper[4990]: I1003 11:52:55.303647 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:52:55 crc kubenswrapper[4990]: I1003 11:52:55.304505 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:53:25 crc kubenswrapper[4990]: I1003 11:53:25.304625 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 11:53:25 crc kubenswrapper[4990]: I1003 11:53:25.305352 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 11:53:25 crc kubenswrapper[4990]: I1003 
11:53:25.305438 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 11:53:25 crc kubenswrapper[4990]: I1003 11:53:25.306673 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 11:53:25 crc kubenswrapper[4990]: I1003 11:53:25.306771 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" gracePeriod=600 Oct 03 11:53:25 crc kubenswrapper[4990]: E1003 11:53:25.441111 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:53:26 crc kubenswrapper[4990]: I1003 11:53:26.394436 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" exitCode=0 Oct 03 11:53:26 crc kubenswrapper[4990]: I1003 11:53:26.394657 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"} Oct 03 11:53:26 crc kubenswrapper[4990]: I1003 11:53:26.395643 4990 scope.go:117] "RemoveContainer" containerID="836bfd8bd420440c04f038b28dec8dead6c507dc06e80c463c36bc55b27c3c1d" Oct 03 11:53:26 crc kubenswrapper[4990]: I1003 11:53:26.397209 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:53:26 crc kubenswrapper[4990]: E1003 11:53:26.398038 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:53:33 crc kubenswrapper[4990]: I1003 11:53:33.478751 4990 generic.go:334] "Generic (PLEG): container finished" podID="0347a2f6-e939-4c8c-bc86-45622a71f3d4" containerID="49f7a41a65bce07ab477f681c448b88c87d83024065b44afa04290bffdd43f18" exitCode=0 Oct 03 11:53:33 crc kubenswrapper[4990]: I1003 11:53:33.479744 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" event={"ID":"0347a2f6-e939-4c8c-bc86-45622a71f3d4","Type":"ContainerDied","Data":"49f7a41a65bce07ab477f681c448b88c87d83024065b44afa04290bffdd43f18"} Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.034907 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.111241 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key\") pod \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.111455 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory\") pod \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.111592 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qqn2\" (UniqueName: \"kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2\") pod \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\" (UID: \"0347a2f6-e939-4c8c-bc86-45622a71f3d4\") " Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.119885 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2" (OuterVolumeSpecName: "kube-api-access-7qqn2") pod "0347a2f6-e939-4c8c-bc86-45622a71f3d4" (UID: "0347a2f6-e939-4c8c-bc86-45622a71f3d4"). InnerVolumeSpecName "kube-api-access-7qqn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.156886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0347a2f6-e939-4c8c-bc86-45622a71f3d4" (UID: "0347a2f6-e939-4c8c-bc86-45622a71f3d4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.172104 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory" (OuterVolumeSpecName: "inventory") pod "0347a2f6-e939-4c8c-bc86-45622a71f3d4" (UID: "0347a2f6-e939-4c8c-bc86-45622a71f3d4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.213721 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.213764 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0347a2f6-e939-4c8c-bc86-45622a71f3d4-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.213777 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qqn2\" (UniqueName: \"kubernetes.io/projected/0347a2f6-e939-4c8c-bc86-45622a71f3d4-kube-api-access-7qqn2\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.538289 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" event={"ID":"0347a2f6-e939-4c8c-bc86-45622a71f3d4","Type":"ContainerDied","Data":"f71597644b783060722ae19c3121b9339171cb42d007558ceb686fa6762ad60e"} Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.538379 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71597644b783060722ae19c3121b9339171cb42d007558ceb686fa6762ad60e" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.538540 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-pf4zt" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.626020 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-bczw5"] Oct 03 11:53:35 crc kubenswrapper[4990]: E1003 11:53:35.627771 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0347a2f6-e939-4c8c-bc86-45622a71f3d4" containerName="configure-os-openstack-openstack-cell1" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.628243 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0347a2f6-e939-4c8c-bc86-45622a71f3d4" containerName="configure-os-openstack-openstack-cell1" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.628756 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0347a2f6-e939-4c8c-bc86-45622a71f3d4" containerName="configure-os-openstack-openstack-cell1" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.630296 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.632341 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.632973 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.633247 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.633287 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.642042 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-bczw5"] Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.646531 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9gf\" (UniqueName: \"kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.646661 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.646905 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.748912 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9gf\" (UniqueName: \"kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.749132 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.750899 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.755691 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.756308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.774336 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j9gf\" (UniqueName: \"kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf\") pod \"ssh-known-hosts-openstack-bczw5\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:35 crc kubenswrapper[4990]: I1003 11:53:35.965437 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:36 crc kubenswrapper[4990]: I1003 11:53:36.565427 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-bczw5"] Oct 03 11:53:37 crc kubenswrapper[4990]: I1003 11:53:37.561323 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-bczw5" event={"ID":"8ebf76a3-140d-477c-81e3-e623baa47115","Type":"ContainerStarted","Data":"79016089da8b04c859d65dbede5fca6ad38c9ec638cc149e9e45f3452175c0bc"} Oct 03 11:53:37 crc kubenswrapper[4990]: I1003 11:53:37.562150 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-bczw5" event={"ID":"8ebf76a3-140d-477c-81e3-e623baa47115","Type":"ContainerStarted","Data":"8a15fc4143d0b5651fd15ca343788ab2779f62f32a1263658d92798bc3a2d4c6"} Oct 03 11:53:37 crc kubenswrapper[4990]: I1003 11:53:37.592202 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-bczw5" podStartSLOduration=1.990088528 podStartE2EDuration="2.592177314s" podCreationTimestamp="2025-10-03 11:53:35 +0000 UTC" firstStartedPulling="2025-10-03 11:53:36.567717331 +0000 UTC m=+7798.364349208" 
lastFinishedPulling="2025-10-03 11:53:37.169806127 +0000 UTC m=+7798.966437994" observedRunningTime="2025-10-03 11:53:37.587228966 +0000 UTC m=+7799.383860853" watchObservedRunningTime="2025-10-03 11:53:37.592177314 +0000 UTC m=+7799.388809191" Oct 03 11:53:37 crc kubenswrapper[4990]: I1003 11:53:37.872282 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:53:37 crc kubenswrapper[4990]: E1003 11:53:37.872675 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:53:46 crc kubenswrapper[4990]: I1003 11:53:46.659121 4990 generic.go:334] "Generic (PLEG): container finished" podID="8ebf76a3-140d-477c-81e3-e623baa47115" containerID="79016089da8b04c859d65dbede5fca6ad38c9ec638cc149e9e45f3452175c0bc" exitCode=0 Oct 03 11:53:46 crc kubenswrapper[4990]: I1003 11:53:46.659249 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-bczw5" event={"ID":"8ebf76a3-140d-477c-81e3-e623baa47115","Type":"ContainerDied","Data":"79016089da8b04c859d65dbede5fca6ad38c9ec638cc149e9e45f3452175c0bc"} Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.179233 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.338012 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0\") pod \"8ebf76a3-140d-477c-81e3-e623baa47115\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.338654 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1\") pod \"8ebf76a3-140d-477c-81e3-e623baa47115\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.338803 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j9gf\" (UniqueName: \"kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf\") pod \"8ebf76a3-140d-477c-81e3-e623baa47115\" (UID: \"8ebf76a3-140d-477c-81e3-e623baa47115\") " Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.347742 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf" (OuterVolumeSpecName: "kube-api-access-8j9gf") pod "8ebf76a3-140d-477c-81e3-e623baa47115" (UID: "8ebf76a3-140d-477c-81e3-e623baa47115"). InnerVolumeSpecName "kube-api-access-8j9gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.371168 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8ebf76a3-140d-477c-81e3-e623baa47115" (UID: "8ebf76a3-140d-477c-81e3-e623baa47115"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.373121 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8ebf76a3-140d-477c-81e3-e623baa47115" (UID: "8ebf76a3-140d-477c-81e3-e623baa47115"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.441055 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.441098 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j9gf\" (UniqueName: \"kubernetes.io/projected/8ebf76a3-140d-477c-81e3-e623baa47115-kube-api-access-8j9gf\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.441113 4990 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8ebf76a3-140d-477c-81e3-e623baa47115-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.690994 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-bczw5" event={"ID":"8ebf76a3-140d-477c-81e3-e623baa47115","Type":"ContainerDied","Data":"8a15fc4143d0b5651fd15ca343788ab2779f62f32a1263658d92798bc3a2d4c6"} Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.691036 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a15fc4143d0b5651fd15ca343788ab2779f62f32a1263658d92798bc3a2d4c6" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.691107 4990 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-bczw5" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.752025 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-w5tgt"] Oct 03 11:53:48 crc kubenswrapper[4990]: E1003 11:53:48.752444 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebf76a3-140d-477c-81e3-e623baa47115" containerName="ssh-known-hosts-openstack" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.752455 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebf76a3-140d-477c-81e3-e623baa47115" containerName="ssh-known-hosts-openstack" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.760906 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebf76a3-140d-477c-81e3-e623baa47115" containerName="ssh-known-hosts-openstack" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.761912 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.764065 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.764214 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.766237 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.766444 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.766551 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-w5tgt"] Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 
11:53:48.850305 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.850359 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpwzt\" (UniqueName: \"kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.850565 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.953733 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.954102 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " 
pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.954219 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpwzt\" (UniqueName: \"kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.957980 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.958758 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:48 crc kubenswrapper[4990]: I1003 11:53:48.974079 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpwzt\" (UniqueName: \"kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt\") pod \"run-os-openstack-openstack-cell1-w5tgt\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:49 crc kubenswrapper[4990]: I1003 11:53:49.092676 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:53:49 crc kubenswrapper[4990]: I1003 11:53:49.635816 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-w5tgt"] Oct 03 11:53:49 crc kubenswrapper[4990]: W1003 11:53:49.641344 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52183779_8954_4290_9dad_d7135f152399.slice/crio-fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e WatchSource:0}: Error finding container fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e: Status 404 returned error can't find the container with id fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e Oct 03 11:53:49 crc kubenswrapper[4990]: I1003 11:53:49.703794 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" event={"ID":"52183779-8954-4290-9dad-d7135f152399","Type":"ContainerStarted","Data":"fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e"} Oct 03 11:53:49 crc kubenswrapper[4990]: I1003 11:53:49.872185 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:53:49 crc kubenswrapper[4990]: E1003 11:53:49.872463 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:53:51 crc kubenswrapper[4990]: I1003 11:53:51.729717 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" 
event={"ID":"52183779-8954-4290-9dad-d7135f152399","Type":"ContainerStarted","Data":"7862ac70f17bd379b65287ba99369d54b3dd6285495763ef1f0b25d1988f1e22"} Oct 03 11:53:51 crc kubenswrapper[4990]: I1003 11:53:51.758911 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" podStartSLOduration=2.568608305 podStartE2EDuration="3.758891118s" podCreationTimestamp="2025-10-03 11:53:48 +0000 UTC" firstStartedPulling="2025-10-03 11:53:49.644586911 +0000 UTC m=+7811.441218758" lastFinishedPulling="2025-10-03 11:53:50.834869694 +0000 UTC m=+7812.631501571" observedRunningTime="2025-10-03 11:53:51.752274517 +0000 UTC m=+7813.548906414" watchObservedRunningTime="2025-10-03 11:53:51.758891118 +0000 UTC m=+7813.555522975" Oct 03 11:53:59 crc kubenswrapper[4990]: I1003 11:53:59.827439 4990 generic.go:334] "Generic (PLEG): container finished" podID="52183779-8954-4290-9dad-d7135f152399" containerID="7862ac70f17bd379b65287ba99369d54b3dd6285495763ef1f0b25d1988f1e22" exitCode=0 Oct 03 11:53:59 crc kubenswrapper[4990]: I1003 11:53:59.827560 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" event={"ID":"52183779-8954-4290-9dad-d7135f152399","Type":"ContainerDied","Data":"7862ac70f17bd379b65287ba99369d54b3dd6285495763ef1f0b25d1988f1e22"} Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.320453 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.474973 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpwzt\" (UniqueName: \"kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt\") pod \"52183779-8954-4290-9dad-d7135f152399\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.475119 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key\") pod \"52183779-8954-4290-9dad-d7135f152399\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.475211 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory\") pod \"52183779-8954-4290-9dad-d7135f152399\" (UID: \"52183779-8954-4290-9dad-d7135f152399\") " Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.500743 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt" (OuterVolumeSpecName: "kube-api-access-cpwzt") pod "52183779-8954-4290-9dad-d7135f152399" (UID: "52183779-8954-4290-9dad-d7135f152399"). InnerVolumeSpecName "kube-api-access-cpwzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.509693 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory" (OuterVolumeSpecName: "inventory") pod "52183779-8954-4290-9dad-d7135f152399" (UID: "52183779-8954-4290-9dad-d7135f152399"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.530819 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52183779-8954-4290-9dad-d7135f152399" (UID: "52183779-8954-4290-9dad-d7135f152399"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.577213 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.577280 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpwzt\" (UniqueName: \"kubernetes.io/projected/52183779-8954-4290-9dad-d7135f152399-kube-api-access-cpwzt\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.577297 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52183779-8954-4290-9dad-d7135f152399-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.850716 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" event={"ID":"52183779-8954-4290-9dad-d7135f152399","Type":"ContainerDied","Data":"fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e"} Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.851058 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc0967976385b2a4b6dfed4ef827cd416df212471c32a82d7a23b49eb4af750e" Oct 03 11:54:01 crc kubenswrapper[4990]: I1003 11:54:01.850829 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-w5tgt" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.016206 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-kgw28"] Oct 03 11:54:02 crc kubenswrapper[4990]: E1003 11:54:02.016674 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52183779-8954-4290-9dad-d7135f152399" containerName="run-os-openstack-openstack-cell1" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.016705 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="52183779-8954-4290-9dad-d7135f152399" containerName="run-os-openstack-openstack-cell1" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.016993 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="52183779-8954-4290-9dad-d7135f152399" containerName="run-os-openstack-openstack-cell1" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.018005 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.020093 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.020606 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.022038 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.030953 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-kgw28"] Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.033614 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.188326 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.188671 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zch8v\" (UniqueName: \"kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.188871 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.291228 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zch8v\" (UniqueName: \"kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.291396 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.291662 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.302584 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.308165 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.312018 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zch8v\" (UniqueName: \"kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v\") pod \"reboot-os-openstack-openstack-cell1-kgw28\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:02 crc kubenswrapper[4990]: I1003 11:54:02.355600 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:03 crc kubenswrapper[4990]: I1003 11:54:03.014632 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-kgw28"] Oct 03 11:54:03 crc kubenswrapper[4990]: W1003 11:54:03.017358 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9b036e9_0a70_4dbf_abc9_9a366d7e1126.slice/crio-aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1 WatchSource:0}: Error finding container aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1: Status 404 returned error can't find the container with id aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1 Oct 03 11:54:03 crc kubenswrapper[4990]: I1003 11:54:03.876286 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" event={"ID":"e9b036e9-0a70-4dbf-abc9-9a366d7e1126","Type":"ContainerStarted","Data":"c22b253691e55a7ce778dc527bb50395ce3b57232753e2c79976d7fa5441df11"} Oct 03 11:54:03 crc kubenswrapper[4990]: I1003 11:54:03.876656 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" event={"ID":"e9b036e9-0a70-4dbf-abc9-9a366d7e1126","Type":"ContainerStarted","Data":"aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1"} Oct 03 11:54:03 crc kubenswrapper[4990]: I1003 11:54:03.901832 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" podStartSLOduration=2.448707876 podStartE2EDuration="2.901811017s" podCreationTimestamp="2025-10-03 11:54:01 +0000 UTC" firstStartedPulling="2025-10-03 11:54:03.020878198 +0000 UTC m=+7824.817510085" lastFinishedPulling="2025-10-03 11:54:03.473981349 +0000 UTC m=+7825.270613226" observedRunningTime="2025-10-03 11:54:03.900665487 +0000 UTC m=+7825.697297374" watchObservedRunningTime="2025-10-03 11:54:03.901811017 +0000 UTC m=+7825.698442894" Oct 03 11:54:04 crc kubenswrapper[4990]: I1003 11:54:04.871340 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:54:04 crc kubenswrapper[4990]: E1003 11:54:04.871853 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:54:19 crc kubenswrapper[4990]: I1003 11:54:19.872825 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:54:19 crc kubenswrapper[4990]: E1003 11:54:19.873690 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:54:20 crc kubenswrapper[4990]: I1003 11:54:20.081093 4990 generic.go:334] "Generic (PLEG): container finished" podID="e9b036e9-0a70-4dbf-abc9-9a366d7e1126" containerID="c22b253691e55a7ce778dc527bb50395ce3b57232753e2c79976d7fa5441df11" exitCode=0 Oct 03 11:54:20 crc kubenswrapper[4990]: I1003 11:54:20.081168 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" event={"ID":"e9b036e9-0a70-4dbf-abc9-9a366d7e1126","Type":"ContainerDied","Data":"c22b253691e55a7ce778dc527bb50395ce3b57232753e2c79976d7fa5441df11"} Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.580462 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.673245 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory\") pod \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.673311 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zch8v\" (UniqueName: \"kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v\") pod \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.673347 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key\") pod 
\"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\" (UID: \"e9b036e9-0a70-4dbf-abc9-9a366d7e1126\") " Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.685075 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v" (OuterVolumeSpecName: "kube-api-access-zch8v") pod "e9b036e9-0a70-4dbf-abc9-9a366d7e1126" (UID: "e9b036e9-0a70-4dbf-abc9-9a366d7e1126"). InnerVolumeSpecName "kube-api-access-zch8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.722108 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory" (OuterVolumeSpecName: "inventory") pod "e9b036e9-0a70-4dbf-abc9-9a366d7e1126" (UID: "e9b036e9-0a70-4dbf-abc9-9a366d7e1126"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.728674 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9b036e9-0a70-4dbf-abc9-9a366d7e1126" (UID: "e9b036e9-0a70-4dbf-abc9-9a366d7e1126"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.777194 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.777607 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zch8v\" (UniqueName: \"kubernetes.io/projected/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-kube-api-access-zch8v\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:21 crc kubenswrapper[4990]: I1003 11:54:21.777680 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9b036e9-0a70-4dbf-abc9-9a366d7e1126-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.110554 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" event={"ID":"e9b036e9-0a70-4dbf-abc9-9a366d7e1126","Type":"ContainerDied","Data":"aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1"} Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.110993 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aace25a9619ed22470fa55e624ef7141e89113c617e6e5579413b095df8d6bb1" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.110606 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-kgw28" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.202220 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-q4nck"] Oct 03 11:54:22 crc kubenswrapper[4990]: E1003 11:54:22.203094 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b036e9-0a70-4dbf-abc9-9a366d7e1126" containerName="reboot-os-openstack-openstack-cell1" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.203222 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b036e9-0a70-4dbf-abc9-9a366d7e1126" containerName="reboot-os-openstack-openstack-cell1" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.203557 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b036e9-0a70-4dbf-abc9-9a366d7e1126" containerName="reboot-os-openstack-openstack-cell1" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.204696 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.209171 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.209422 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.209752 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.209924 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.210143 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.210537 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.210712 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.216961 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.234302 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-q4nck"] Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290129 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290199 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290225 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290382 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290454 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: 
\"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290614 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290658 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290771 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8sj\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290802 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 
11:54:22.290864 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.290924 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.291095 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.291230 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.291409 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.291487 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393161 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393262 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8sj\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393287 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393350 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393406 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393484 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393540 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" 
(UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393644 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393665 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393726 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393764 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393786 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393855 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393881 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.393982 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.398053 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.398350 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.398395 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.398625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.399421 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.400834 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.400977 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.401601 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.403249 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.403467 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: 
\"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.403664 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.404770 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.407376 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.409291 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 
11:54:22.411628 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8sj\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj\") pod \"install-certs-openstack-openstack-cell1-q4nck\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:22 crc kubenswrapper[4990]: I1003 11:54:22.528289 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:54:23 crc kubenswrapper[4990]: I1003 11:54:23.077278 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-q4nck"] Oct 03 11:54:23 crc kubenswrapper[4990]: I1003 11:54:23.120549 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" event={"ID":"abbd0668-e891-45f4-b884-8876d504f476","Type":"ContainerStarted","Data":"2976805c27e98e9f2d7708d554e324e3492682360cb53fbd83bae193d6b99bbb"} Oct 03 11:54:25 crc kubenswrapper[4990]: I1003 11:54:25.154913 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" event={"ID":"abbd0668-e891-45f4-b884-8876d504f476","Type":"ContainerStarted","Data":"ec0cbb0715283a01a89444d103d8701a40b375b42dca9eb1da870f399e21c124"} Oct 03 11:54:25 crc kubenswrapper[4990]: I1003 11:54:25.180751 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" podStartSLOduration=2.378949157 podStartE2EDuration="3.180725939s" podCreationTimestamp="2025-10-03 11:54:22 +0000 UTC" firstStartedPulling="2025-10-03 11:54:23.083123396 +0000 UTC m=+7844.879755253" lastFinishedPulling="2025-10-03 11:54:23.884900178 +0000 UTC m=+7845.681532035" observedRunningTime="2025-10-03 11:54:25.174252592 +0000 UTC m=+7846.970884469" 
watchObservedRunningTime="2025-10-03 11:54:25.180725939 +0000 UTC m=+7846.977357806" Oct 03 11:54:30 crc kubenswrapper[4990]: I1003 11:54:30.872556 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:54:30 crc kubenswrapper[4990]: E1003 11:54:30.873536 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:54:41 crc kubenswrapper[4990]: I1003 11:54:41.872383 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:54:41 crc kubenswrapper[4990]: E1003 11:54:41.873264 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:54:53 crc kubenswrapper[4990]: I1003 11:54:53.872542 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:54:53 crc kubenswrapper[4990]: E1003 11:54:53.873610 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:55:02 crc kubenswrapper[4990]: I1003 11:55:02.528888 4990 generic.go:334] "Generic (PLEG): container finished" podID="abbd0668-e891-45f4-b884-8876d504f476" containerID="ec0cbb0715283a01a89444d103d8701a40b375b42dca9eb1da870f399e21c124" exitCode=0 Oct 03 11:55:02 crc kubenswrapper[4990]: I1003 11:55:02.529003 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" event={"ID":"abbd0668-e891-45f4-b884-8876d504f476","Type":"ContainerDied","Data":"ec0cbb0715283a01a89444d103d8701a40b375b42dca9eb1da870f399e21c124"} Oct 03 11:55:03 crc kubenswrapper[4990]: I1003 11:55:03.953967 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059463 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059587 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059633 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: 
\"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059662 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059689 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059742 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059763 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.059844 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc 
kubenswrapper[4990]: I1003 11:55:04.060396 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.060433 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.060480 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.060922 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd8sj\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.060957 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.060992 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.061090 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle\") pod \"abbd0668-e891-45f4-b884-8876d504f476\" (UID: \"abbd0668-e891-45f4-b884-8876d504f476\") " Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.066657 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.072408 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073277 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073450 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073447 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073498 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj" (OuterVolumeSpecName: "kube-api-access-xd8sj") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "kube-api-access-xd8sj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073567 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.073742 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.075265 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.075760 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.076363 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.076707 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.076729 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.100873 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.104654 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory" (OuterVolumeSpecName: "inventory") pod "abbd0668-e891-45f4-b884-8876d504f476" (UID: "abbd0668-e891-45f4-b884-8876d504f476"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164633 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164683 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164700 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164718 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164731 4990 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 
11:55:04.164744 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164758 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164771 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164783 4990 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164793 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164805 4990 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164817 4990 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164831 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd8sj\" (UniqueName: \"kubernetes.io/projected/abbd0668-e891-45f4-b884-8876d504f476-kube-api-access-xd8sj\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164842 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.164876 4990 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abbd0668-e891-45f4-b884-8876d504f476-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.546633 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" event={"ID":"abbd0668-e891-45f4-b884-8876d504f476","Type":"ContainerDied","Data":"2976805c27e98e9f2d7708d554e324e3492682360cb53fbd83bae193d6b99bbb"} Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.546678 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2976805c27e98e9f2d7708d554e324e3492682360cb53fbd83bae193d6b99bbb" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.546735 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-q4nck" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.675527 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dhz6j"] Oct 03 11:55:04 crc kubenswrapper[4990]: E1003 11:55:04.676268 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbd0668-e891-45f4-b884-8876d504f476" containerName="install-certs-openstack-openstack-cell1" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.676298 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbd0668-e891-45f4-b884-8876d504f476" containerName="install-certs-openstack-openstack-cell1" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.676698 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="abbd0668-e891-45f4-b884-8876d504f476" containerName="install-certs-openstack-openstack-cell1" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.677757 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.680155 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.681611 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.681626 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.682289 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.682292 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.686336 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dhz6j"] Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.784465 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dph\" (UniqueName: \"kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.784545 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 
crc kubenswrapper[4990]: I1003 11:55:04.784605 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.784734 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.785003 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.872625 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:55:04 crc kubenswrapper[4990]: E1003 11:55:04.872959 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.887600 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.887679 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.887769 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.887831 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dph\" (UniqueName: \"kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.887862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc 
kubenswrapper[4990]: I1003 11:55:04.888964 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.895234 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.895429 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.896700 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:04 crc kubenswrapper[4990]: I1003 11:55:04.909733 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dph\" (UniqueName: \"kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph\") pod \"ovn-openstack-openstack-cell1-dhz6j\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:05 crc 
kubenswrapper[4990]: I1003 11:55:05.002726 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:55:05 crc kubenswrapper[4990]: I1003 11:55:05.562043 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dhz6j"] Oct 03 11:55:05 crc kubenswrapper[4990]: W1003 11:55:05.572131 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac3694d2_235d_4f91_a2ac_eb6b627112b3.slice/crio-1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17 WatchSource:0}: Error finding container 1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17: Status 404 returned error can't find the container with id 1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17 Oct 03 11:55:06 crc kubenswrapper[4990]: I1003 11:55:06.570689 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" event={"ID":"ac3694d2-235d-4f91-a2ac-eb6b627112b3","Type":"ContainerStarted","Data":"8c64b23870882086263c9c64af1ec2153b3045f8a7fa2029fa550c5f7a4cd05b"} Oct 03 11:55:06 crc kubenswrapper[4990]: I1003 11:55:06.570959 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" event={"ID":"ac3694d2-235d-4f91-a2ac-eb6b627112b3","Type":"ContainerStarted","Data":"1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17"} Oct 03 11:55:06 crc kubenswrapper[4990]: I1003 11:55:06.594905 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" podStartSLOduration=2.123290517 podStartE2EDuration="2.594881057s" podCreationTimestamp="2025-10-03 11:55:04 +0000 UTC" firstStartedPulling="2025-10-03 11:55:05.575036404 +0000 UTC m=+7887.371668261" lastFinishedPulling="2025-10-03 11:55:06.046626934 +0000 UTC m=+7887.843258801" 
observedRunningTime="2025-10-03 11:55:06.589738664 +0000 UTC m=+7888.386370541" watchObservedRunningTime="2025-10-03 11:55:06.594881057 +0000 UTC m=+7888.391512924" Oct 03 11:55:15 crc kubenswrapper[4990]: I1003 11:55:15.873401 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:55:15 crc kubenswrapper[4990]: E1003 11:55:15.874337 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:55:28 crc kubenswrapper[4990]: I1003 11:55:28.885036 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:55:28 crc kubenswrapper[4990]: E1003 11:55:28.885906 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:55:40 crc kubenswrapper[4990]: I1003 11:55:40.872736 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:55:40 crc kubenswrapper[4990]: E1003 11:55:40.873539 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:55:54 crc kubenswrapper[4990]: I1003 11:55:54.871504 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:55:54 crc kubenswrapper[4990]: E1003 11:55:54.872539 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:56:07 crc kubenswrapper[4990]: I1003 11:56:07.872630 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:56:07 crc kubenswrapper[4990]: E1003 11:56:07.873794 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:56:13 crc kubenswrapper[4990]: I1003 11:56:13.318109 4990 generic.go:334] "Generic (PLEG): container finished" podID="ac3694d2-235d-4f91-a2ac-eb6b627112b3" containerID="8c64b23870882086263c9c64af1ec2153b3045f8a7fa2029fa550c5f7a4cd05b" exitCode=0 Oct 03 11:56:13 crc kubenswrapper[4990]: I1003 11:56:13.318213 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" 
event={"ID":"ac3694d2-235d-4f91-a2ac-eb6b627112b3","Type":"ContainerDied","Data":"8c64b23870882086263c9c64af1ec2153b3045f8a7fa2029fa550c5f7a4cd05b"} Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.899192 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.970024 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.970601 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.970673 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.970734 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.970856 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2dph\" (UniqueName: 
\"kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.978362 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph" (OuterVolumeSpecName: "kube-api-access-h2dph") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3"). InnerVolumeSpecName "kube-api-access-h2dph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:56:14 crc kubenswrapper[4990]: I1003 11:56:14.978654 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.001894 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory" (OuterVolumeSpecName: "inventory") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:56:15 crc kubenswrapper[4990]: E1003 11:56:15.006692 4990 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0 podName:ac3694d2-235d-4f91-a2ac-eb6b627112b3 nodeName:}" failed. No retries permitted until 2025-10-03 11:56:15.50666158 +0000 UTC m=+7957.303293437 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovncontroller-config-0" (UniqueName: "kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3") : error deleting /var/lib/kubelet/pods/ac3694d2-235d-4f91-a2ac-eb6b627112b3/volume-subpaths: remove /var/lib/kubelet/pods/ac3694d2-235d-4f91-a2ac-eb6b627112b3/volume-subpaths: no such file or directory Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.012550 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.073587 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.073651 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.073667 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac3694d2-235d-4f91-a2ac-eb6b627112b3-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.073699 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2dph\" (UniqueName: \"kubernetes.io/projected/ac3694d2-235d-4f91-a2ac-eb6b627112b3-kube-api-access-h2dph\") on node \"crc\" DevicePath \"\"" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 
11:56:15.345547 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" event={"ID":"ac3694d2-235d-4f91-a2ac-eb6b627112b3","Type":"ContainerDied","Data":"1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17"} Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.345648 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aad786d3c70df34b46eac08936de42c7fc3cfb24bd1d707f26deacca1890b17" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.345675 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dhz6j" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.448686 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dj8b7"] Oct 03 11:56:15 crc kubenswrapper[4990]: E1003 11:56:15.449227 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3694d2-235d-4f91-a2ac-eb6b627112b3" containerName="ovn-openstack-openstack-cell1" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.449259 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3694d2-235d-4f91-a2ac-eb6b627112b3" containerName="ovn-openstack-openstack-cell1" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.449483 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3694d2-235d-4f91-a2ac-eb6b627112b3" containerName="ovn-openstack-openstack-cell1" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.450384 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.452530 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.453416 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.466702 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dj8b7"] Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.584490 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") pod \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\" (UID: \"ac3694d2-235d-4f91-a2ac-eb6b627112b3\") " Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.584956 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585014 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585041 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585093 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585129 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dgk\" (UniqueName: \"kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585183 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.585725 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ac3694d2-235d-4f91-a2ac-eb6b627112b3" (UID: "ac3694d2-235d-4f91-a2ac-eb6b627112b3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.687471 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dgk\" (UniqueName: \"kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.687868 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.688148 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.688366 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: 
\"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.688662 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.688883 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.689088 4990 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ac3694d2-235d-4f91-a2ac-eb6b627112b3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.694666 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.694780 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.696120 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.707444 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.708353 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.709750 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dgk\" (UniqueName: \"kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk\") pod \"neutron-metadata-openstack-openstack-cell1-dj8b7\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 
03 11:56:15 crc kubenswrapper[4990]: I1003 11:56:15.785985 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:56:16 crc kubenswrapper[4990]: I1003 11:56:16.340714 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-dj8b7"] Oct 03 11:56:16 crc kubenswrapper[4990]: I1003 11:56:16.373042 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" event={"ID":"dd955a7e-39fc-4115-94d4-f578031c03ca","Type":"ContainerStarted","Data":"e495225d4edd04050f121df06c45745d9cf887626e57cc19ba440fffe28c5057"} Oct 03 11:56:18 crc kubenswrapper[4990]: I1003 11:56:18.392291 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" event={"ID":"dd955a7e-39fc-4115-94d4-f578031c03ca","Type":"ContainerStarted","Data":"2d492dcb9aeca33fe7e8635d39cdafc4d8a4216a458f6833e8dba83f4e00bbe9"} Oct 03 11:56:18 crc kubenswrapper[4990]: I1003 11:56:18.422462 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" podStartSLOduration=2.658445638 podStartE2EDuration="3.42244114s" podCreationTimestamp="2025-10-03 11:56:15 +0000 UTC" firstStartedPulling="2025-10-03 11:56:16.350083416 +0000 UTC m=+7958.146715273" lastFinishedPulling="2025-10-03 11:56:17.114078878 +0000 UTC m=+7958.910710775" observedRunningTime="2025-10-03 11:56:18.41238558 +0000 UTC m=+7960.209017467" watchObservedRunningTime="2025-10-03 11:56:18.42244114 +0000 UTC m=+7960.219073017" Oct 03 11:56:22 crc kubenswrapper[4990]: I1003 11:56:22.873661 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:56:22 crc kubenswrapper[4990]: E1003 11:56:22.876443 4990 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:56:36 crc kubenswrapper[4990]: I1003 11:56:36.874112 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:56:36 crc kubenswrapper[4990]: E1003 11:56:36.875724 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:56:46 crc kubenswrapper[4990]: I1003 11:56:46.868334 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:56:46 crc kubenswrapper[4990]: I1003 11:56:46.875464 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:46 crc kubenswrapper[4990]: I1003 11:56:46.904026 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.013109 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djbsf\" (UniqueName: \"kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.014319 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.014487 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.116249 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djbsf\" (UniqueName: \"kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.116854 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.117146 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.117654 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.117657 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.139625 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djbsf\" (UniqueName: \"kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf\") pod \"community-operators-gp5p2\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.215011 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.752386 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:56:47 crc kubenswrapper[4990]: I1003 11:56:47.872243 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:56:47 crc kubenswrapper[4990]: E1003 11:56:47.872610 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:56:48 crc kubenswrapper[4990]: I1003 11:56:48.721669 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerID="c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd" exitCode=0 Oct 03 11:56:48 crc kubenswrapper[4990]: I1003 11:56:48.721738 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerDied","Data":"c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd"} Oct 03 11:56:48 crc kubenswrapper[4990]: I1003 11:56:48.722022 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerStarted","Data":"871f12f43f8ae7cfadc42fbed2e97ba1329882c700533f7251924912e703fd34"} Oct 03 11:56:50 crc kubenswrapper[4990]: I1003 11:56:50.744590 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" 
containerID="5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0" exitCode=0 Oct 03 11:56:50 crc kubenswrapper[4990]: I1003 11:56:50.745575 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerDied","Data":"5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0"} Oct 03 11:56:51 crc kubenswrapper[4990]: I1003 11:56:51.756294 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerStarted","Data":"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8"} Oct 03 11:56:51 crc kubenswrapper[4990]: I1003 11:56:51.771862 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gp5p2" podStartSLOduration=3.286139635 podStartE2EDuration="5.771848035s" podCreationTimestamp="2025-10-03 11:56:46 +0000 UTC" firstStartedPulling="2025-10-03 11:56:48.724198183 +0000 UTC m=+7990.520830040" lastFinishedPulling="2025-10-03 11:56:51.209906543 +0000 UTC m=+7993.006538440" observedRunningTime="2025-10-03 11:56:51.770131931 +0000 UTC m=+7993.566763788" watchObservedRunningTime="2025-10-03 11:56:51.771848035 +0000 UTC m=+7993.568479882" Oct 03 11:56:57 crc kubenswrapper[4990]: I1003 11:56:57.216220 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:57 crc kubenswrapper[4990]: I1003 11:56:57.216989 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:57 crc kubenswrapper[4990]: I1003 11:56:57.271913 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:57 crc kubenswrapper[4990]: I1003 
11:56:57.915420 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:56:57 crc kubenswrapper[4990]: I1003 11:56:57.986335 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:56:59 crc kubenswrapper[4990]: I1003 11:56:59.854195 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gp5p2" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="registry-server" containerID="cri-o://3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8" gracePeriod=2 Oct 03 11:56:59 crc kubenswrapper[4990]: I1003 11:56:59.873912 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:56:59 crc kubenswrapper[4990]: E1003 11:56:59.874293 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.387789 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.444486 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content\") pod \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.444730 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djbsf\" (UniqueName: \"kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf\") pod \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.444808 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities\") pod \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\" (UID: \"ebf539e5-f020-472e-9a82-1f59bd6f8c96\") " Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.445678 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities" (OuterVolumeSpecName: "utilities") pod "ebf539e5-f020-472e-9a82-1f59bd6f8c96" (UID: "ebf539e5-f020-472e-9a82-1f59bd6f8c96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.457776 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf" (OuterVolumeSpecName: "kube-api-access-djbsf") pod "ebf539e5-f020-472e-9a82-1f59bd6f8c96" (UID: "ebf539e5-f020-472e-9a82-1f59bd6f8c96"). InnerVolumeSpecName "kube-api-access-djbsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.548177 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.548764 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djbsf\" (UniqueName: \"kubernetes.io/projected/ebf539e5-f020-472e-9a82-1f59bd6f8c96-kube-api-access-djbsf\") on node \"crc\" DevicePath \"\"" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.602205 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf539e5-f020-472e-9a82-1f59bd6f8c96" (UID: "ebf539e5-f020-472e-9a82-1f59bd6f8c96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.649997 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf539e5-f020-472e-9a82-1f59bd6f8c96-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.867181 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerID="3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8" exitCode=0 Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.867265 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerDied","Data":"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8"} Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.867298 4990 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-gp5p2" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.867474 4990 scope.go:117] "RemoveContainer" containerID="3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.867357 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gp5p2" event={"ID":"ebf539e5-f020-472e-9a82-1f59bd6f8c96","Type":"ContainerDied","Data":"871f12f43f8ae7cfadc42fbed2e97ba1329882c700533f7251924912e703fd34"} Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.908249 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.909115 4990 scope.go:117] "RemoveContainer" containerID="5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0" Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.922443 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gp5p2"] Oct 03 11:57:00 crc kubenswrapper[4990]: I1003 11:57:00.939347 4990 scope.go:117] "RemoveContainer" containerID="c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.007947 4990 scope.go:117] "RemoveContainer" containerID="3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8" Oct 03 11:57:01 crc kubenswrapper[4990]: E1003 11:57:01.008423 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8\": container with ID starting with 3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8 not found: ID does not exist" containerID="3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.008614 
4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8"} err="failed to get container status \"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8\": rpc error: code = NotFound desc = could not find container \"3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8\": container with ID starting with 3cf11d7671768302b9496e05bfbe732f1a327d74dca19c2e089b083700235ff8 not found: ID does not exist" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.008663 4990 scope.go:117] "RemoveContainer" containerID="5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0" Oct 03 11:57:01 crc kubenswrapper[4990]: E1003 11:57:01.008950 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0\": container with ID starting with 5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0 not found: ID does not exist" containerID="5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.008984 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0"} err="failed to get container status \"5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0\": rpc error: code = NotFound desc = could not find container \"5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0\": container with ID starting with 5e1be291426745454e9e3174fb14784001afa17b0045f0ec81153bb43253c6e0 not found: ID does not exist" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.009010 4990 scope.go:117] "RemoveContainer" containerID="c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd" Oct 03 11:57:01 crc kubenswrapper[4990]: E1003 
11:57:01.009398 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd\": container with ID starting with c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd not found: ID does not exist" containerID="c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd" Oct 03 11:57:01 crc kubenswrapper[4990]: I1003 11:57:01.009433 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd"} err="failed to get container status \"c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd\": rpc error: code = NotFound desc = could not find container \"c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd\": container with ID starting with c0abc6523d1f755492516901b6e3c961a352129ca04cbea3b93634dcf78294dd not found: ID does not exist" Oct 03 11:57:02 crc kubenswrapper[4990]: I1003 11:57:02.894571 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" path="/var/lib/kubelet/pods/ebf539e5-f020-472e-9a82-1f59bd6f8c96/volumes" Oct 03 11:57:10 crc kubenswrapper[4990]: I1003 11:57:10.871685 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 11:57:10 crc kubenswrapper[4990]: E1003 11:57:10.872504 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 11:57:12 crc kubenswrapper[4990]: I1003 11:57:12.027230 
4990 generic.go:334] "Generic (PLEG): container finished" podID="dd955a7e-39fc-4115-94d4-f578031c03ca" containerID="2d492dcb9aeca33fe7e8635d39cdafc4d8a4216a458f6833e8dba83f4e00bbe9" exitCode=0 Oct 03 11:57:12 crc kubenswrapper[4990]: I1003 11:57:12.027292 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" event={"ID":"dd955a7e-39fc-4115-94d4-f578031c03ca","Type":"ContainerDied","Data":"2d492dcb9aeca33fe7e8635d39cdafc4d8a4216a458f6833e8dba83f4e00bbe9"} Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.509394 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656333 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656424 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656499 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656572 4990 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656623 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.656686 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2dgk\" (UniqueName: \"kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk\") pod \"dd955a7e-39fc-4115-94d4-f578031c03ca\" (UID: \"dd955a7e-39fc-4115-94d4-f578031c03ca\") " Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.661504 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk" (OuterVolumeSpecName: "kube-api-access-w2dgk") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "kube-api-access-w2dgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.665802 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.686934 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.688695 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory" (OuterVolumeSpecName: "inventory") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.693240 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.695387 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "dd955a7e-39fc-4115-94d4-f578031c03ca" (UID: "dd955a7e-39fc-4115-94d4-f578031c03ca"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759405 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759427 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-inventory\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759438 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759448 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2dgk\" (UniqueName: \"kubernetes.io/projected/dd955a7e-39fc-4115-94d4-f578031c03ca-kube-api-access-w2dgk\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759457 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:13 crc kubenswrapper[4990]: I1003 11:57:13.759466 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/dd955a7e-39fc-4115-94d4-f578031c03ca-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.056446 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7" event={"ID":"dd955a7e-39fc-4115-94d4-f578031c03ca","Type":"ContainerDied","Data":"e495225d4edd04050f121df06c45745d9cf887626e57cc19ba440fffe28c5057"}
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.056502 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e495225d4edd04050f121df06c45745d9cf887626e57cc19ba440fffe28c5057"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.056612 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-dj8b7"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.221326 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-zjd6k"]
Oct 03 11:57:14 crc kubenswrapper[4990]: E1003 11:57:14.221884 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd955a7e-39fc-4115-94d4-f578031c03ca" containerName="neutron-metadata-openstack-openstack-cell1"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.221908 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd955a7e-39fc-4115-94d4-f578031c03ca" containerName="neutron-metadata-openstack-openstack-cell1"
Oct 03 11:57:14 crc kubenswrapper[4990]: E1003 11:57:14.221925 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="registry-server"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.221935 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="registry-server"
Oct 03 11:57:14 crc kubenswrapper[4990]: E1003 11:57:14.221954 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="extract-content"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.221961 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="extract-content"
Oct 03 11:57:14 crc kubenswrapper[4990]: E1003 11:57:14.221981 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="extract-utilities"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.221989 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="extract-utilities"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.222249 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd955a7e-39fc-4115-94d4-f578031c03ca" containerName="neutron-metadata-openstack-openstack-cell1"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.222266 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf539e5-f020-472e-9a82-1f59bd6f8c96" containerName="registry-server"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.223240 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.225476 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.225468 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.227349 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.227900 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.228130 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.236679 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-zjd6k"]
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.375013 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.375217 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.375266 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.375307 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmpn\" (UniqueName: \"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.375338 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.477716 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmpn\" (UniqueName: \"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.477826 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.477903 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.478164 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.478304 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.481616 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.481803 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.482193 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.484906 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.508050 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmpn\" (UniqueName: \"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn\") pod \"libvirt-openstack-openstack-cell1-zjd6k\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:14 crc kubenswrapper[4990]: I1003 11:57:14.544281 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k"
Oct 03 11:57:15 crc kubenswrapper[4990]: I1003 11:57:15.112082 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-zjd6k"]
Oct 03 11:57:15 crc kubenswrapper[4990]: I1003 11:57:15.113240 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 11:57:16 crc kubenswrapper[4990]: I1003 11:57:16.081804 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" event={"ID":"724cbe0c-9b28-4fc2-acdc-b67f8518acfe","Type":"ContainerStarted","Data":"715706971e6e078fdc6a1c936cdb5eaddbb0d54af3227e043be776f69f8d58a8"}
Oct 03 11:57:16 crc kubenswrapper[4990]: I1003 11:57:16.082564 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" event={"ID":"724cbe0c-9b28-4fc2-acdc-b67f8518acfe","Type":"ContainerStarted","Data":"31986456a37cde2e8d5a74e30079a63acd1e9a0148dfc73b9bae0fec4e119f3a"}
Oct 03 11:57:16 crc kubenswrapper[4990]: I1003 11:57:16.108824 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" podStartSLOduration=1.48140341 podStartE2EDuration="2.108803664s" podCreationTimestamp="2025-10-03 11:57:14 +0000 UTC" firstStartedPulling="2025-10-03 11:57:15.112788129 +0000 UTC m=+8016.909419996" lastFinishedPulling="2025-10-03 11:57:15.740188393 +0000 UTC m=+8017.536820250" observedRunningTime="2025-10-03 11:57:16.099716229 +0000 UTC m=+8017.896348086" watchObservedRunningTime="2025-10-03 11:57:16.108803664 +0000 UTC m=+8017.905435521"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.144920 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.149366 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.167663 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.334362 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.334484 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.334542 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrg6\" (UniqueName: \"kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.436666 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.436813 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.436843 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrg6\" (UniqueName: \"kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.437136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.437452 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.463166 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrg6\" (UniqueName: \"kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6\") pod \"certified-operators-8d9tw\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") " pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.474231 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:25 crc kubenswrapper[4990]: I1003 11:57:25.876481 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:57:25 crc kubenswrapper[4990]: E1003 11:57:25.877346 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:57:26 crc kubenswrapper[4990]: I1003 11:57:26.070344 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:26 crc kubenswrapper[4990]: I1003 11:57:26.200411 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerStarted","Data":"a5c1a2742150e32d65222b6b865cf7ae13870d371b6027729d1269c6f0d6b735"}
Oct 03 11:57:27 crc kubenswrapper[4990]: I1003 11:57:27.216374 4990 generic.go:334] "Generic (PLEG): container finished" podID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerID="1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806" exitCode=0
Oct 03 11:57:27 crc kubenswrapper[4990]: I1003 11:57:27.216491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerDied","Data":"1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806"}
Oct 03 11:57:28 crc kubenswrapper[4990]: I1003 11:57:28.230041 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerStarted","Data":"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"}
Oct 03 11:57:29 crc kubenswrapper[4990]: I1003 11:57:29.244412 4990 generic.go:334] "Generic (PLEG): container finished" podID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerID="4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065" exitCode=0
Oct 03 11:57:29 crc kubenswrapper[4990]: I1003 11:57:29.244495 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerDied","Data":"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"}
Oct 03 11:57:31 crc kubenswrapper[4990]: I1003 11:57:31.275436 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerStarted","Data":"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"}
Oct 03 11:57:31 crc kubenswrapper[4990]: I1003 11:57:31.309622 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d9tw" podStartSLOduration=3.394779426 podStartE2EDuration="6.309575366s" podCreationTimestamp="2025-10-03 11:57:25 +0000 UTC" firstStartedPulling="2025-10-03 11:57:27.220629951 +0000 UTC m=+8029.017261808" lastFinishedPulling="2025-10-03 11:57:30.135425881 +0000 UTC m=+8031.932057748" observedRunningTime="2025-10-03 11:57:31.297322779 +0000 UTC m=+8033.093954686" watchObservedRunningTime="2025-10-03 11:57:31.309575366 +0000 UTC m=+8033.106207253"
Oct 03 11:57:35 crc kubenswrapper[4990]: I1003 11:57:35.474793 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:35 crc kubenswrapper[4990]: I1003 11:57:35.475341 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:35 crc kubenswrapper[4990]: I1003 11:57:35.541006 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:36 crc kubenswrapper[4990]: I1003 11:57:36.401394 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:36 crc kubenswrapper[4990]: I1003 11:57:36.450675 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:37 crc kubenswrapper[4990]: I1003 11:57:37.872265 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:57:37 crc kubenswrapper[4990]: E1003 11:57:37.873992 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.361659 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d9tw" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="registry-server" containerID="cri-o://fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc" gracePeriod=2
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.814565 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.944216 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content\") pod \"d515e996-729b-44e2-9b89-025d6e1c86c4\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") "
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.944685 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities\") pod \"d515e996-729b-44e2-9b89-025d6e1c86c4\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") "
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.944793 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcrg6\" (UniqueName: \"kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6\") pod \"d515e996-729b-44e2-9b89-025d6e1c86c4\" (UID: \"d515e996-729b-44e2-9b89-025d6e1c86c4\") "
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.945465 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities" (OuterVolumeSpecName: "utilities") pod "d515e996-729b-44e2-9b89-025d6e1c86c4" (UID: "d515e996-729b-44e2-9b89-025d6e1c86c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.949937 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6" (OuterVolumeSpecName: "kube-api-access-fcrg6") pod "d515e996-729b-44e2-9b89-025d6e1c86c4" (UID: "d515e996-729b-44e2-9b89-025d6e1c86c4"). InnerVolumeSpecName "kube-api-access-fcrg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 11:57:38 crc kubenswrapper[4990]: I1003 11:57:38.990176 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d515e996-729b-44e2-9b89-025d6e1c86c4" (UID: "d515e996-729b-44e2-9b89-025d6e1c86c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.051706 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.051749 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcrg6\" (UniqueName: \"kubernetes.io/projected/d515e996-729b-44e2-9b89-025d6e1c86c4-kube-api-access-fcrg6\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.051764 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d515e996-729b-44e2-9b89-025d6e1c86c4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.376442 4990 generic.go:334] "Generic (PLEG): container finished" podID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerID="fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc" exitCode=0
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.376563 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d9tw"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.376567 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerDied","Data":"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"}
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.377007 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d9tw" event={"ID":"d515e996-729b-44e2-9b89-025d6e1c86c4","Type":"ContainerDied","Data":"a5c1a2742150e32d65222b6b865cf7ae13870d371b6027729d1269c6f0d6b735"}
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.377038 4990 scope.go:117] "RemoveContainer" containerID="fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.412756 4990 scope.go:117] "RemoveContainer" containerID="4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.417626 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.426018 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d9tw"]
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.443411 4990 scope.go:117] "RemoveContainer" containerID="1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.488347 4990 scope.go:117] "RemoveContainer" containerID="fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"
Oct 03 11:57:39 crc kubenswrapper[4990]: E1003 11:57:39.488758 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc\": container with ID starting with fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc not found: ID does not exist" containerID="fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.488806 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc"} err="failed to get container status \"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc\": rpc error: code = NotFound desc = could not find container \"fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc\": container with ID starting with fdd400b56ade00b183c50fcb736c77e968a4fa7fd2acdfd5c5e8dca4a05018dc not found: ID does not exist"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.488844 4990 scope.go:117] "RemoveContainer" containerID="4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"
Oct 03 11:57:39 crc kubenswrapper[4990]: E1003 11:57:39.489296 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065\": container with ID starting with 4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065 not found: ID does not exist" containerID="4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.490002 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065"} err="failed to get container status \"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065\": rpc error: code = NotFound desc = could not find container \"4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065\": container with ID starting with 4c67997990fd9b5392a86f6d91f3e027c9f611c4c2c66d0a8291034cba74c065 not found: ID does not exist"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.490252 4990 scope.go:117] "RemoveContainer" containerID="1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806"
Oct 03 11:57:39 crc kubenswrapper[4990]: E1003 11:57:39.490783 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806\": container with ID starting with 1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806 not found: ID does not exist" containerID="1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806"
Oct 03 11:57:39 crc kubenswrapper[4990]: I1003 11:57:39.490818 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806"} err="failed to get container status \"1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806\": rpc error: code = NotFound desc = could not find container \"1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806\": container with ID starting with 1cd2415677d67d0996c000891c63285e9178720c231eb86d87f367a07dbc2806 not found: ID does not exist"
Oct 03 11:57:40 crc kubenswrapper[4990]: I1003 11:57:40.894548 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" path="/var/lib/kubelet/pods/d515e996-729b-44e2-9b89-025d6e1c86c4/volumes"
Oct 03 11:57:51 crc kubenswrapper[4990]: I1003 11:57:51.873939 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:57:51 crc kubenswrapper[4990]: E1003 11:57:51.875443 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:58:02 crc kubenswrapper[4990]: I1003 11:58:02.872199 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:58:02 crc kubenswrapper[4990]: E1003 11:58:02.874193 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:58:15 crc kubenswrapper[4990]: I1003 11:58:15.872144 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:58:15 crc kubenswrapper[4990]: E1003 11:58:15.874627 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02"
Oct 03 11:58:30 crc kubenswrapper[4990]: I1003 11:58:30.873022 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb"
Oct 03 11:58:32 crc kubenswrapper[4990]: I1003 11:58:32.014287 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d"}
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.194019 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"]
Oct 03 11:59:41 crc kubenswrapper[4990]: E1003 11:59:41.195405 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="extract-utilities"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.195418 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="extract-utilities"
Oct 03 11:59:41 crc kubenswrapper[4990]: E1003 11:59:41.195438 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="extract-content"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.195444 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="extract-content"
Oct 03 11:59:41 crc kubenswrapper[4990]: E1003 11:59:41.195468 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="registry-server"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.195476 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="registry-server"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.195713 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="d515e996-729b-44e2-9b89-025d6e1c86c4" containerName="registry-server"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.197232 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.222590 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"]
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.309376 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.309499 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d796l\" (UniqueName: \"kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.309623 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.411372 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.411504 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d796l\" (UniqueName: \"kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.411579 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.411948 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.412078 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003 11:59:41.441147 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d796l\" (UniqueName: \"kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l\") pod \"redhat-marketplace-8zr7p\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " pod="openshift-marketplace/redhat-marketplace-8zr7p"
Oct 03 11:59:41 crc kubenswrapper[4990]: I1003
11:59:41.516837 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:42 crc kubenswrapper[4990]: I1003 11:59:42.025286 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"] Oct 03 11:59:42 crc kubenswrapper[4990]: I1003 11:59:42.776856 4990 generic.go:334] "Generic (PLEG): container finished" podID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerID="8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5" exitCode=0 Oct 03 11:59:42 crc kubenswrapper[4990]: I1003 11:59:42.776959 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerDied","Data":"8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5"} Oct 03 11:59:42 crc kubenswrapper[4990]: I1003 11:59:42.777366 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerStarted","Data":"c96128bb9585c0f323af70a29cc0c6b57c7a754d6c0887cb7210f9f56c9e93b4"} Oct 03 11:59:44 crc kubenswrapper[4990]: I1003 11:59:44.799272 4990 generic.go:334] "Generic (PLEG): container finished" podID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerID="bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388" exitCode=0 Oct 03 11:59:44 crc kubenswrapper[4990]: I1003 11:59:44.799476 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerDied","Data":"bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388"} Oct 03 11:59:45 crc kubenswrapper[4990]: I1003 11:59:45.811139 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" 
event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerStarted","Data":"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8"} Oct 03 11:59:45 crc kubenswrapper[4990]: I1003 11:59:45.837463 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8zr7p" podStartSLOduration=2.408179148 podStartE2EDuration="4.837444869s" podCreationTimestamp="2025-10-03 11:59:41 +0000 UTC" firstStartedPulling="2025-10-03 11:59:42.778821093 +0000 UTC m=+8164.575452950" lastFinishedPulling="2025-10-03 11:59:45.208086804 +0000 UTC m=+8167.004718671" observedRunningTime="2025-10-03 11:59:45.82932995 +0000 UTC m=+8167.625961807" watchObservedRunningTime="2025-10-03 11:59:45.837444869 +0000 UTC m=+8167.634076726" Oct 03 11:59:51 crc kubenswrapper[4990]: I1003 11:59:51.516948 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:51 crc kubenswrapper[4990]: I1003 11:59:51.517723 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:51 crc kubenswrapper[4990]: I1003 11:59:51.591371 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:51 crc kubenswrapper[4990]: I1003 11:59:51.947427 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:51 crc kubenswrapper[4990]: I1003 11:59:51.999368 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"] Oct 03 11:59:53 crc kubenswrapper[4990]: I1003 11:59:53.907597 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8zr7p" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="registry-server" 
containerID="cri-o://0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8" gracePeriod=2 Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.356864 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.417816 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content\") pod \"daca4d5a-2f69-46bd-a70a-44a824d355fe\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.417911 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d796l\" (UniqueName: \"kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l\") pod \"daca4d5a-2f69-46bd-a70a-44a824d355fe\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.418118 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities\") pod \"daca4d5a-2f69-46bd-a70a-44a824d355fe\" (UID: \"daca4d5a-2f69-46bd-a70a-44a824d355fe\") " Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.418877 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities" (OuterVolumeSpecName: "utilities") pod "daca4d5a-2f69-46bd-a70a-44a824d355fe" (UID: "daca4d5a-2f69-46bd-a70a-44a824d355fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.430664 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l" (OuterVolumeSpecName: "kube-api-access-d796l") pod "daca4d5a-2f69-46bd-a70a-44a824d355fe" (UID: "daca4d5a-2f69-46bd-a70a-44a824d355fe"). InnerVolumeSpecName "kube-api-access-d796l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.442764 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "daca4d5a-2f69-46bd-a70a-44a824d355fe" (UID: "daca4d5a-2f69-46bd-a70a-44a824d355fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.521431 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.521761 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d796l\" (UniqueName: \"kubernetes.io/projected/daca4d5a-2f69-46bd-a70a-44a824d355fe-kube-api-access-d796l\") on node \"crc\" DevicePath \"\"" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.521853 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daca4d5a-2f69-46bd-a70a-44a824d355fe-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.947760 4990 generic.go:334] "Generic (PLEG): container finished" podID="daca4d5a-2f69-46bd-a70a-44a824d355fe" 
containerID="0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8" exitCode=0 Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.947808 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerDied","Data":"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8"} Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.947834 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zr7p" event={"ID":"daca4d5a-2f69-46bd-a70a-44a824d355fe","Type":"ContainerDied","Data":"c96128bb9585c0f323af70a29cc0c6b57c7a754d6c0887cb7210f9f56c9e93b4"} Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.947852 4990 scope.go:117] "RemoveContainer" containerID="0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.948016 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zr7p" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.983258 4990 scope.go:117] "RemoveContainer" containerID="bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388" Oct 03 11:59:54 crc kubenswrapper[4990]: I1003 11:59:54.985458 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"] Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.002839 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zr7p"] Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.016281 4990 scope.go:117] "RemoveContainer" containerID="8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.053803 4990 scope.go:117] "RemoveContainer" containerID="0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8" Oct 03 11:59:55 crc kubenswrapper[4990]: E1003 11:59:55.054284 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8\": container with ID starting with 0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8 not found: ID does not exist" containerID="0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.054330 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8"} err="failed to get container status \"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8\": rpc error: code = NotFound desc = could not find container \"0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8\": container with ID starting with 0d71810a64a0739ed3fd06c6b52a4323982f9a7307d8ea3538d325f054a9d7e8 not found: 
ID does not exist" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.054358 4990 scope.go:117] "RemoveContainer" containerID="bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388" Oct 03 11:59:55 crc kubenswrapper[4990]: E1003 11:59:55.054931 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388\": container with ID starting with bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388 not found: ID does not exist" containerID="bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.055036 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388"} err="failed to get container status \"bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388\": rpc error: code = NotFound desc = could not find container \"bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388\": container with ID starting with bd054200791cf7f78408063cc9cf943607d8898113ac68e2bf847b87b010a388 not found: ID does not exist" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.055110 4990 scope.go:117] "RemoveContainer" containerID="8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5" Oct 03 11:59:55 crc kubenswrapper[4990]: E1003 11:59:55.055509 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5\": container with ID starting with 8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5 not found: ID does not exist" containerID="8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5" Oct 03 11:59:55 crc kubenswrapper[4990]: I1003 11:59:55.055568 4990 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5"} err="failed to get container status \"8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5\": rpc error: code = NotFound desc = could not find container \"8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5\": container with ID starting with 8acd7a3c4a62c264e7603952e2bafdd32fc779b5e39ae03530615fae75711ba5 not found: ID does not exist" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.414867 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 11:59:56 crc kubenswrapper[4990]: E1003 11:59:56.415732 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="extract-utilities" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.415750 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="extract-utilities" Oct 03 11:59:56 crc kubenswrapper[4990]: E1003 11:59:56.415778 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="registry-server" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.415786 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="registry-server" Oct 03 11:59:56 crc kubenswrapper[4990]: E1003 11:59:56.415827 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="extract-content" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.415835 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="extract-content" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.416101 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" containerName="registry-server" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.418257 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.437264 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.569292 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.569839 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcsc4\" (UniqueName: \"kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.570101 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.672438 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " 
pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.672531 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcsc4\" (UniqueName: \"kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.672643 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.672968 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.673228 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.691445 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcsc4\" (UniqueName: \"kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4\") pod \"redhat-operators-zw6sn\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " pod="openshift-marketplace/redhat-operators-zw6sn" Oct 
03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.761039 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 11:59:56 crc kubenswrapper[4990]: I1003 11:59:56.919617 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daca4d5a-2f69-46bd-a70a-44a824d355fe" path="/var/lib/kubelet/pods/daca4d5a-2f69-46bd-a70a-44a824d355fe/volumes" Oct 03 11:59:57 crc kubenswrapper[4990]: I1003 11:59:57.286766 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 11:59:57 crc kubenswrapper[4990]: I1003 11:59:57.983867 4990 generic.go:334] "Generic (PLEG): container finished" podID="716d0452-65c5-496a-866c-3a9dd5643d87" containerID="1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941" exitCode=0 Oct 03 11:59:57 crc kubenswrapper[4990]: I1003 11:59:57.983943 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerDied","Data":"1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941"} Oct 03 11:59:57 crc kubenswrapper[4990]: I1003 11:59:57.984479 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerStarted","Data":"53a531f9d95390e956d5afceb8a49ac7eb7111bdff1feddf71235d43d13fb20b"} Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.010968 4990 generic.go:334] "Generic (PLEG): container finished" podID="716d0452-65c5-496a-866c-3a9dd5643d87" containerID="f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee" exitCode=0 Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.011063 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" 
event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerDied","Data":"f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee"} Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.159352 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk"] Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.161861 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.165601 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.166898 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.170824 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk"] Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.275310 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfk94\" (UniqueName: \"kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.275571 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.275620 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.377237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.377288 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.377367 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfk94\" (UniqueName: \"kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.378589 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.384318 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.394976 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfk94\" (UniqueName: \"kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94\") pod \"collect-profiles-29324880-dxrgk\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:00 crc kubenswrapper[4990]: I1003 12:00:00.483650 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:01 crc kubenswrapper[4990]: I1003 12:00:01.021490 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerStarted","Data":"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46"} Oct 03 12:00:01 crc kubenswrapper[4990]: I1003 12:00:01.038396 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zw6sn" podStartSLOduration=2.494798513 podStartE2EDuration="5.038368637s" podCreationTimestamp="2025-10-03 11:59:56 +0000 UTC" firstStartedPulling="2025-10-03 11:59:57.986366592 +0000 UTC m=+8179.782998489" lastFinishedPulling="2025-10-03 12:00:00.529936756 +0000 UTC m=+8182.326568613" observedRunningTime="2025-10-03 12:00:01.036885559 +0000 UTC m=+8182.833517416" watchObservedRunningTime="2025-10-03 12:00:01.038368637 +0000 UTC m=+8182.835000494" Oct 03 12:00:01 crc kubenswrapper[4990]: I1003 12:00:01.163404 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk"] Oct 03 12:00:02 crc kubenswrapper[4990]: I1003 12:00:02.031950 4990 generic.go:334] "Generic (PLEG): container finished" podID="b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" containerID="690cb34e77af049c0c38feaaee5944f91cb3ea860435a60cf806c722970e4d2c" exitCode=0 Oct 03 12:00:02 crc kubenswrapper[4990]: I1003 12:00:02.032026 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" event={"ID":"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8","Type":"ContainerDied","Data":"690cb34e77af049c0c38feaaee5944f91cb3ea860435a60cf806c722970e4d2c"} Oct 03 12:00:02 crc kubenswrapper[4990]: I1003 12:00:02.032355 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" event={"ID":"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8","Type":"ContainerStarted","Data":"7d09af68334ecd6eb61709d554fd1d22fc681569aa2a9841f93108955e6f2d83"} Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.437014 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.562429 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume\") pod \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.562607 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume\") pod \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.562770 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfk94\" (UniqueName: \"kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94\") pod \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\" (UID: \"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8\") " Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.563224 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" (UID: "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.563886 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.569281 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94" (OuterVolumeSpecName: "kube-api-access-dfk94") pod "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" (UID: "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8"). InnerVolumeSpecName "kube-api-access-dfk94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.569307 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" (UID: "b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.665705 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:03 crc kubenswrapper[4990]: I1003 12:00:03.665743 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfk94\" (UniqueName: \"kubernetes.io/projected/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8-kube-api-access-dfk94\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.052464 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" event={"ID":"b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8","Type":"ContainerDied","Data":"7d09af68334ecd6eb61709d554fd1d22fc681569aa2a9841f93108955e6f2d83"} Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.052576 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk" Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.052563 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d09af68334ecd6eb61709d554fd1d22fc681569aa2a9841f93108955e6f2d83" Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.517993 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s"] Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.531446 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324835-76f4s"] Oct 03 12:00:04 crc kubenswrapper[4990]: I1003 12:00:04.884939 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab766225-b91b-4563-94fd-6b5a679ec419" path="/var/lib/kubelet/pods/ab766225-b91b-4563-94fd-6b5a679ec419/volumes" Oct 03 12:00:06 crc kubenswrapper[4990]: I1003 12:00:06.761283 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:06 crc kubenswrapper[4990]: I1003 12:00:06.761693 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:06 crc kubenswrapper[4990]: I1003 12:00:06.865910 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:07 crc kubenswrapper[4990]: I1003 12:00:07.153443 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:07 crc kubenswrapper[4990]: I1003 12:00:07.204767 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.105495 4990 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-zw6sn" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="registry-server" containerID="cri-o://041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46" gracePeriod=2 Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.598790 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.692180 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities\") pod \"716d0452-65c5-496a-866c-3a9dd5643d87\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.692606 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content\") pod \"716d0452-65c5-496a-866c-3a9dd5643d87\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.692636 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcsc4\" (UniqueName: \"kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4\") pod \"716d0452-65c5-496a-866c-3a9dd5643d87\" (UID: \"716d0452-65c5-496a-866c-3a9dd5643d87\") " Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.693526 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities" (OuterVolumeSpecName: "utilities") pod "716d0452-65c5-496a-866c-3a9dd5643d87" (UID: "716d0452-65c5-496a-866c-3a9dd5643d87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.699054 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4" (OuterVolumeSpecName: "kube-api-access-mcsc4") pod "716d0452-65c5-496a-866c-3a9dd5643d87" (UID: "716d0452-65c5-496a-866c-3a9dd5643d87"). InnerVolumeSpecName "kube-api-access-mcsc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.775736 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "716d0452-65c5-496a-866c-3a9dd5643d87" (UID: "716d0452-65c5-496a-866c-3a9dd5643d87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.794990 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.795020 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716d0452-65c5-496a-866c-3a9dd5643d87-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:09 crc kubenswrapper[4990]: I1003 12:00:09.795042 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcsc4\" (UniqueName: \"kubernetes.io/projected/716d0452-65c5-496a-866c-3a9dd5643d87-kube-api-access-mcsc4\") on node \"crc\" DevicePath \"\"" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.119165 4990 generic.go:334] "Generic (PLEG): container finished" podID="716d0452-65c5-496a-866c-3a9dd5643d87" 
containerID="041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46" exitCode=0 Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.119223 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zw6sn" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.119234 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerDied","Data":"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46"} Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.119306 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zw6sn" event={"ID":"716d0452-65c5-496a-866c-3a9dd5643d87","Type":"ContainerDied","Data":"53a531f9d95390e956d5afceb8a49ac7eb7111bdff1feddf71235d43d13fb20b"} Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.119342 4990 scope.go:117] "RemoveContainer" containerID="041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.159454 4990 scope.go:117] "RemoveContainer" containerID="f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.160147 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.168779 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zw6sn"] Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.185140 4990 scope.go:117] "RemoveContainer" containerID="1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.233231 4990 scope.go:117] "RemoveContainer" containerID="041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46" Oct 03 12:00:10 crc 
kubenswrapper[4990]: E1003 12:00:10.234064 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46\": container with ID starting with 041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46 not found: ID does not exist" containerID="041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.234106 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46"} err="failed to get container status \"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46\": rpc error: code = NotFound desc = could not find container \"041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46\": container with ID starting with 041db3bd0c7d54d097cb92e74a33ab35387e0a8541674e95219383fe2216ff46 not found: ID does not exist" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.234127 4990 scope.go:117] "RemoveContainer" containerID="f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee" Oct 03 12:00:10 crc kubenswrapper[4990]: E1003 12:00:10.234401 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee\": container with ID starting with f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee not found: ID does not exist" containerID="f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.234418 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee"} err="failed to get container status 
\"f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee\": rpc error: code = NotFound desc = could not find container \"f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee\": container with ID starting with f85ac0c1ff4c0998fcfab5a6316b18538f6b511137e5d79fabbb493bc78f3bee not found: ID does not exist" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.234430 4990 scope.go:117] "RemoveContainer" containerID="1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941" Oct 03 12:00:10 crc kubenswrapper[4990]: E1003 12:00:10.234758 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941\": container with ID starting with 1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941 not found: ID does not exist" containerID="1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.234797 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941"} err="failed to get container status \"1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941\": rpc error: code = NotFound desc = could not find container \"1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941\": container with ID starting with 1b38b3d00090036e9575840df1fa5470a05e667a6f221a7653e40e668e372941 not found: ID does not exist" Oct 03 12:00:10 crc kubenswrapper[4990]: I1003 12:00:10.883883 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" path="/var/lib/kubelet/pods/716d0452-65c5-496a-866c-3a9dd5643d87/volumes" Oct 03 12:00:54 crc kubenswrapper[4990]: I1003 12:00:54.329616 4990 scope.go:117] "RemoveContainer" containerID="485690513bc470ca3e1e6af02639d1407d43dd91380520a3b247a7ce9e4ab225" Oct 03 
12:00:55 crc kubenswrapper[4990]: I1003 12:00:55.303911 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:00:55 crc kubenswrapper[4990]: I1003 12:00:55.303996 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.196409 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29324881-h7424"] Oct 03 12:01:00 crc kubenswrapper[4990]: E1003 12:01:00.197924 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="extract-utilities" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.197948 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="extract-utilities" Oct 03 12:01:00 crc kubenswrapper[4990]: E1003 12:01:00.197972 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" containerName="collect-profiles" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.197984 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" containerName="collect-profiles" Oct 03 12:01:00 crc kubenswrapper[4990]: E1003 12:01:00.198029 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="registry-server" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.198042 4990 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="registry-server" Oct 03 12:01:00 crc kubenswrapper[4990]: E1003 12:01:00.198092 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="extract-content" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.198101 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="extract-content" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.198440 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" containerName="collect-profiles" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.198479 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="716d0452-65c5-496a-866c-3a9dd5643d87" containerName="registry-server" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.199758 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.222410 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324881-h7424"] Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.332922 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fv97\" (UniqueName: \"kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.332988 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.333009 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.333323 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.435862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.435939 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.436243 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.436298 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fv97\" (UniqueName: \"kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.442790 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.443338 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.443681 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.455305 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fv97\" (UniqueName: \"kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97\") pod \"keystone-cron-29324881-h7424\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:00 crc kubenswrapper[4990]: I1003 12:01:00.524903 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:01 crc kubenswrapper[4990]: I1003 12:01:01.052596 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324881-h7424"] Oct 03 12:01:01 crc kubenswrapper[4990]: I1003 12:01:01.762433 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324881-h7424" event={"ID":"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e","Type":"ContainerStarted","Data":"ec39dd5095ab16f2254c9ac2184a9037a3d58a29d37651d66ca4b356cf47dd42"} Oct 03 12:01:01 crc kubenswrapper[4990]: I1003 12:01:01.762910 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324881-h7424" event={"ID":"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e","Type":"ContainerStarted","Data":"6a6f9d84f0e64de20d8ef68b6dae6452849dc359316b663463f536f4f8c70195"} Oct 03 12:01:01 crc kubenswrapper[4990]: I1003 12:01:01.785789 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29324881-h7424" podStartSLOduration=1.7857707139999999 podStartE2EDuration="1.785770714s" podCreationTimestamp="2025-10-03 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:01:01.783640969 +0000 UTC m=+8243.580272836" watchObservedRunningTime="2025-10-03 12:01:01.785770714 +0000 UTC m=+8243.582402591" Oct 03 12:01:04 crc kubenswrapper[4990]: I1003 12:01:04.796964 4990 generic.go:334] "Generic (PLEG): container finished" podID="b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" containerID="ec39dd5095ab16f2254c9ac2184a9037a3d58a29d37651d66ca4b356cf47dd42" exitCode=0 Oct 03 12:01:04 crc kubenswrapper[4990]: I1003 12:01:04.797032 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324881-h7424" 
event={"ID":"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e","Type":"ContainerDied","Data":"ec39dd5095ab16f2254c9ac2184a9037a3d58a29d37651d66ca4b356cf47dd42"} Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.249453 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.373090 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle\") pod \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.373346 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys\") pod \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.373413 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fv97\" (UniqueName: \"kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97\") pod \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.373618 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data\") pod \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\" (UID: \"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e\") " Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.382236 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" (UID: "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.382321 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97" (OuterVolumeSpecName: "kube-api-access-8fv97") pod "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" (UID: "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e"). InnerVolumeSpecName "kube-api-access-8fv97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.408060 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" (UID: "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.456619 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data" (OuterVolumeSpecName: "config-data") pod "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" (UID: "b1f19dc4-6a33-415e-ba7d-90d5d8835f0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.476697 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.476738 4990 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.476752 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fv97\" (UniqueName: \"kubernetes.io/projected/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-kube-api-access-8fv97\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.476768 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f19dc4-6a33-415e-ba7d-90d5d8835f0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.823434 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324881-h7424" event={"ID":"b1f19dc4-6a33-415e-ba7d-90d5d8835f0e","Type":"ContainerDied","Data":"6a6f9d84f0e64de20d8ef68b6dae6452849dc359316b663463f536f4f8c70195"} Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.823959 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6f9d84f0e64de20d8ef68b6dae6452849dc359316b663463f536f4f8c70195" Oct 03 12:01:06 crc kubenswrapper[4990]: I1003 12:01:06.823598 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324881-h7424" Oct 03 12:01:25 crc kubenswrapper[4990]: I1003 12:01:25.304574 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:01:25 crc kubenswrapper[4990]: I1003 12:01:25.305384 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:01:55 crc kubenswrapper[4990]: I1003 12:01:55.303677 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:01:55 crc kubenswrapper[4990]: I1003 12:01:55.304276 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:01:55 crc kubenswrapper[4990]: I1003 12:01:55.304342 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:01:55 crc kubenswrapper[4990]: I1003 12:01:55.305348 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:01:55 crc kubenswrapper[4990]: I1003 12:01:55.305488 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d" gracePeriod=600 Oct 03 12:01:55 crc kubenswrapper[4990]: E1003 12:01:55.519049 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ea38c_26da_4987_a50d_bafecdfbbd02.slice/crio-0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ea38c_26da_4987_a50d_bafecdfbbd02.slice/crio-conmon-0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d.scope\": RecentStats: unable to find data in memory cache]" Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.417556 4990 generic.go:334] "Generic (PLEG): container finished" podID="724cbe0c-9b28-4fc2-acdc-b67f8518acfe" containerID="715706971e6e078fdc6a1c936cdb5eaddbb0d54af3227e043be776f69f8d58a8" exitCode=0 Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.417566 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" event={"ID":"724cbe0c-9b28-4fc2-acdc-b67f8518acfe","Type":"ContainerDied","Data":"715706971e6e078fdc6a1c936cdb5eaddbb0d54af3227e043be776f69f8d58a8"} Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.420611 4990 generic.go:334] "Generic (PLEG): 
container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d" exitCode=0 Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.420649 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d"} Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.420671 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9"} Oct 03 12:01:56 crc kubenswrapper[4990]: I1003 12:01:56.420689 4990 scope.go:117] "RemoveContainer" containerID="773382ba720e467dc4adbbbd23ceeb5df9b98429a39498f3f3593065143349eb" Oct 03 12:01:57 crc kubenswrapper[4990]: I1003 12:01:57.933808 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.007546 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle\") pod \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.007881 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory\") pod \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.007998 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmpn\" (UniqueName: \"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn\") pod \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.008053 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key\") pod \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.008082 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0\") pod \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\" (UID: \"724cbe0c-9b28-4fc2-acdc-b67f8518acfe\") " Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.012934 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn" (OuterVolumeSpecName: "kube-api-access-wbmpn") pod "724cbe0c-9b28-4fc2-acdc-b67f8518acfe" (UID: "724cbe0c-9b28-4fc2-acdc-b67f8518acfe"). InnerVolumeSpecName "kube-api-access-wbmpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.013173 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "724cbe0c-9b28-4fc2-acdc-b67f8518acfe" (UID: "724cbe0c-9b28-4fc2-acdc-b67f8518acfe"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.035782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory" (OuterVolumeSpecName: "inventory") pod "724cbe0c-9b28-4fc2-acdc-b67f8518acfe" (UID: "724cbe0c-9b28-4fc2-acdc-b67f8518acfe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.038010 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "724cbe0c-9b28-4fc2-acdc-b67f8518acfe" (UID: "724cbe0c-9b28-4fc2-acdc-b67f8518acfe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.043488 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "724cbe0c-9b28-4fc2-acdc-b67f8518acfe" (UID: "724cbe0c-9b28-4fc2-acdc-b67f8518acfe"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.110159 4990 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.110189 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.110198 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbmpn\" (UniqueName: \"kubernetes.io/projected/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-kube-api-access-wbmpn\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.110207 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.110215 4990 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/724cbe0c-9b28-4fc2-acdc-b67f8518acfe-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.452236 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" event={"ID":"724cbe0c-9b28-4fc2-acdc-b67f8518acfe","Type":"ContainerDied","Data":"31986456a37cde2e8d5a74e30079a63acd1e9a0148dfc73b9bae0fec4e119f3a"} Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.452275 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31986456a37cde2e8d5a74e30079a63acd1e9a0148dfc73b9bae0fec4e119f3a" Oct 03 12:01:58 crc 
kubenswrapper[4990]: I1003 12:01:58.452300 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-zjd6k" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.547817 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-5rbm6"] Oct 03 12:01:58 crc kubenswrapper[4990]: E1003 12:01:58.548540 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724cbe0c-9b28-4fc2-acdc-b67f8518acfe" containerName="libvirt-openstack-openstack-cell1" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.548562 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="724cbe0c-9b28-4fc2-acdc-b67f8518acfe" containerName="libvirt-openstack-openstack-cell1" Oct 03 12:01:58 crc kubenswrapper[4990]: E1003 12:01:58.548605 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" containerName="keystone-cron" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.548616 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" containerName="keystone-cron" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.548928 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f19dc4-6a33-415e-ba7d-90d5d8835f0e" containerName="keystone-cron" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.548979 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="724cbe0c-9b28-4fc2-acdc-b67f8518acfe" containerName="libvirt-openstack-openstack-cell1" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.550163 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.556820 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.557236 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.557939 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.558130 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.558546 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.560810 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-5rbm6"] Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.560840 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.564130 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621654 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621724 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621810 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4tx\" (UniqueName: \"kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621893 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621948 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.621979 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.622045 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.622076 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.622143 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.724556 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr4tx\" (UniqueName: \"kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.724662 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.724716 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.724759 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.724819 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.725432 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.725859 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.725922 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.725956 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.727825 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.730029 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.730401 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.730593 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.731402 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.731993 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.732398 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.733063 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.752481 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr4tx\" (UniqueName: \"kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx\") pod \"nova-cell1-openstack-openstack-cell1-5rbm6\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:58 crc kubenswrapper[4990]: I1003 12:01:58.881618 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:01:59 crc kubenswrapper[4990]: W1003 12:01:59.493608 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode96962be_8334_4bd3_af96_db55ea084ef2.slice/crio-e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f WatchSource:0}: Error finding container e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f: Status 404 returned error can't find the container with id e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f Oct 03 12:01:59 crc kubenswrapper[4990]: I1003 12:01:59.494916 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-5rbm6"] Oct 03 12:02:00 crc kubenswrapper[4990]: I1003 12:02:00.476341 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" event={"ID":"e96962be-8334-4bd3-af96-db55ea084ef2","Type":"ContainerStarted","Data":"7e66a6e91d9b44c7462075208b093ca2cdc0196cfbe810faf514d8da49d2dded"} Oct 03 12:02:00 crc kubenswrapper[4990]: I1003 12:02:00.476842 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" event={"ID":"e96962be-8334-4bd3-af96-db55ea084ef2","Type":"ContainerStarted","Data":"e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f"} Oct 03 12:02:00 crc kubenswrapper[4990]: I1003 12:02:00.506433 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" podStartSLOduration=1.885381174 podStartE2EDuration="2.506408024s" podCreationTimestamp="2025-10-03 12:01:58 +0000 UTC" firstStartedPulling="2025-10-03 12:01:59.497609719 +0000 UTC m=+8301.294241586" lastFinishedPulling="2025-10-03 12:02:00.118636579 +0000 UTC m=+8301.915268436" observedRunningTime="2025-10-03 12:02:00.496759175 +0000 UTC 
m=+8302.293391042" watchObservedRunningTime="2025-10-03 12:02:00.506408024 +0000 UTC m=+8302.303039901" Oct 03 12:03:55 crc kubenswrapper[4990]: I1003 12:03:55.303966 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:03:55 crc kubenswrapper[4990]: I1003 12:03:55.304781 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:04:25 crc kubenswrapper[4990]: I1003 12:04:25.303821 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:04:25 crc kubenswrapper[4990]: I1003 12:04:25.304456 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.304041 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 
12:04:55.304680 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.304742 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.305628 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.305728 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" gracePeriod=600 Oct 03 12:04:55 crc kubenswrapper[4990]: E1003 12:04:55.437268 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.503962 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" exitCode=0 Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.504014 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9"} Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.504051 4990 scope.go:117] "RemoveContainer" containerID="0700b4ff5885627ac6d798ab72cfd8e35ad037ac18f0753b1777583c22342f6d" Oct 03 12:04:55 crc kubenswrapper[4990]: I1003 12:04:55.505003 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:04:55 crc kubenswrapper[4990]: E1003 12:04:55.505460 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:05:07 crc kubenswrapper[4990]: I1003 12:05:07.873257 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:05:07 crc kubenswrapper[4990]: E1003 12:05:07.874340 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 
12:05:21 crc kubenswrapper[4990]: I1003 12:05:21.873001 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:05:21 crc kubenswrapper[4990]: E1003 12:05:21.874134 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:05:33 crc kubenswrapper[4990]: I1003 12:05:33.872493 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:05:33 crc kubenswrapper[4990]: E1003 12:05:33.873498 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:05:44 crc kubenswrapper[4990]: I1003 12:05:44.873048 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:05:44 crc kubenswrapper[4990]: E1003 12:05:44.873970 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:05:50 crc kubenswrapper[4990]: I1003 12:05:50.188263 4990 generic.go:334] "Generic (PLEG): container finished" podID="e96962be-8334-4bd3-af96-db55ea084ef2" containerID="7e66a6e91d9b44c7462075208b093ca2cdc0196cfbe810faf514d8da49d2dded" exitCode=0 Oct 03 12:05:50 crc kubenswrapper[4990]: I1003 12:05:50.188337 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" event={"ID":"e96962be-8334-4bd3-af96-db55ea084ef2","Type":"ContainerDied","Data":"7e66a6e91d9b44c7462075208b093ca2cdc0196cfbe810faf514d8da49d2dded"} Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.664624 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.853795 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.854161 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.854281 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc 
kubenswrapper[4990]: I1003 12:05:51.854343 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.854759 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.854826 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.855023 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.855070 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.855135 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr4tx\" (UniqueName: 
\"kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx\") pod \"e96962be-8334-4bd3-af96-db55ea084ef2\" (UID: \"e96962be-8334-4bd3-af96-db55ea084ef2\") " Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.860610 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx" (OuterVolumeSpecName: "kube-api-access-cr4tx") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "kube-api-access-cr4tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.871713 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.892430 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.892543 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). 
InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.898709 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory" (OuterVolumeSpecName: "inventory") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.907401 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.915003 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.918017 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.920275 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e96962be-8334-4bd3-af96-db55ea084ef2" (UID: "e96962be-8334-4bd3-af96-db55ea084ef2"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958588 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958631 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958649 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958687 4990 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958704 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 
crc kubenswrapper[4990]: I1003 12:05:51.958724 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958745 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958763 4990 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e96962be-8334-4bd3-af96-db55ea084ef2-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:51 crc kubenswrapper[4990]: I1003 12:05:51.958780 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr4tx\" (UniqueName: \"kubernetes.io/projected/e96962be-8334-4bd3-af96-db55ea084ef2-kube-api-access-cr4tx\") on node \"crc\" DevicePath \"\"" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.213599 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" event={"ID":"e96962be-8334-4bd3-af96-db55ea084ef2","Type":"ContainerDied","Data":"e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f"} Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.213637 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-5rbm6" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.213655 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e406d7cfea3d4b83c4bcf8254cfe9acd5c3dff3c9ec1000ef966f87f5f14e61f" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.336649 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-ssglj"] Oct 03 12:05:52 crc kubenswrapper[4990]: E1003 12:05:52.337579 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96962be-8334-4bd3-af96-db55ea084ef2" containerName="nova-cell1-openstack-openstack-cell1" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.337699 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96962be-8334-4bd3-af96-db55ea084ef2" containerName="nova-cell1-openstack-openstack-cell1" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.338121 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96962be-8334-4bd3-af96-db55ea084ef2" containerName="nova-cell1-openstack-openstack-cell1" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.339247 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.345721 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.346050 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.346197 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.346348 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.346598 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.348404 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-ssglj"] Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472643 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472713 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: 
\"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472809 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472843 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472892 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472918 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.472940 4990 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnc8s\" (UniqueName: \"kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575298 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575650 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575689 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575733 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnc8s\" (UniqueName: \"kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " 
pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575797 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575848 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.575917 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.579266 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.579364 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.579502 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.580316 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.580867 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.581951 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc 
kubenswrapper[4990]: I1003 12:05:52.597076 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnc8s\" (UniqueName: \"kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s\") pod \"telemetry-openstack-openstack-cell1-ssglj\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:52 crc kubenswrapper[4990]: I1003 12:05:52.674312 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:05:53 crc kubenswrapper[4990]: I1003 12:05:53.244992 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-ssglj"] Oct 03 12:05:53 crc kubenswrapper[4990]: I1003 12:05:53.249139 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:05:54 crc kubenswrapper[4990]: I1003 12:05:54.232550 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" event={"ID":"1d028b25-53ad-418d-a47b-3bee28cdd317","Type":"ContainerStarted","Data":"76ca963628001db0b10543c0962ca0a93cc5be6469f4d0ec8f31fc266749c3fd"} Oct 03 12:05:54 crc kubenswrapper[4990]: I1003 12:05:54.232863 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" event={"ID":"1d028b25-53ad-418d-a47b-3bee28cdd317","Type":"ContainerStarted","Data":"3dd67e8eea3941925b9aa49646976f7284c31be2e4d0016798d610e0e7ec5cf5"} Oct 03 12:05:54 crc kubenswrapper[4990]: I1003 12:05:54.275865 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" podStartSLOduration=1.667986617 podStartE2EDuration="2.275834326s" podCreationTimestamp="2025-10-03 12:05:52 +0000 UTC" firstStartedPulling="2025-10-03 12:05:53.248874452 +0000 UTC m=+8535.045506309" 
lastFinishedPulling="2025-10-03 12:05:53.856722161 +0000 UTC m=+8535.653354018" observedRunningTime="2025-10-03 12:05:54.255779348 +0000 UTC m=+8536.052411265" watchObservedRunningTime="2025-10-03 12:05:54.275834326 +0000 UTC m=+8536.072466223" Oct 03 12:05:57 crc kubenswrapper[4990]: I1003 12:05:57.871929 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:05:57 crc kubenswrapper[4990]: E1003 12:05:57.872660 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:06:12 crc kubenswrapper[4990]: I1003 12:06:12.873443 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:06:12 crc kubenswrapper[4990]: E1003 12:06:12.874976 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:06:27 crc kubenswrapper[4990]: I1003 12:06:27.872744 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:06:27 crc kubenswrapper[4990]: E1003 12:06:27.874709 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:06:39 crc kubenswrapper[4990]: I1003 12:06:39.872555 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:06:39 crc kubenswrapper[4990]: E1003 12:06:39.873390 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:06:53 crc kubenswrapper[4990]: I1003 12:06:53.872948 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:06:53 crc kubenswrapper[4990]: E1003 12:06:53.874014 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:07:06 crc kubenswrapper[4990]: I1003 12:07:06.872263 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:07:06 crc kubenswrapper[4990]: E1003 12:07:06.873115 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:07:18 crc kubenswrapper[4990]: I1003 12:07:18.886536 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:07:18 crc kubenswrapper[4990]: E1003 12:07:18.887646 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:07:30 crc kubenswrapper[4990]: I1003 12:07:30.885833 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:07:30 crc kubenswrapper[4990]: E1003 12:07:30.887379 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.500097 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.504695 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.523842 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.618275 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.618356 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.618440 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9898\" (UniqueName: \"kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.720862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.720939 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.720997 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9898\" (UniqueName: \"kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.721741 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.721778 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.744803 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9898\" (UniqueName: \"kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898\") pod \"community-operators-b2842\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:34 crc kubenswrapper[4990]: I1003 12:07:34.874454 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:35 crc kubenswrapper[4990]: I1003 12:07:35.436836 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:35 crc kubenswrapper[4990]: I1003 12:07:35.460547 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerStarted","Data":"28e296743c2eac8a6659aff5d9e0cdc385a94e507a46b62752fd6703bc1ac5f8"} Oct 03 12:07:36 crc kubenswrapper[4990]: I1003 12:07:36.475287 4990 generic.go:334] "Generic (PLEG): container finished" podID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerID="e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a" exitCode=0 Oct 03 12:07:36 crc kubenswrapper[4990]: I1003 12:07:36.475367 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerDied","Data":"e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a"} Oct 03 12:07:38 crc kubenswrapper[4990]: I1003 12:07:38.504668 4990 generic.go:334] "Generic (PLEG): container finished" podID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerID="7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69" exitCode=0 Oct 03 12:07:38 crc kubenswrapper[4990]: I1003 12:07:38.504765 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerDied","Data":"7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69"} Oct 03 12:07:39 crc kubenswrapper[4990]: I1003 12:07:39.526203 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" 
event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerStarted","Data":"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f"} Oct 03 12:07:39 crc kubenswrapper[4990]: I1003 12:07:39.563331 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2842" podStartSLOduration=3.025021901 podStartE2EDuration="5.563308979s" podCreationTimestamp="2025-10-03 12:07:34 +0000 UTC" firstStartedPulling="2025-10-03 12:07:36.477433519 +0000 UTC m=+8638.274065376" lastFinishedPulling="2025-10-03 12:07:39.015720567 +0000 UTC m=+8640.812352454" observedRunningTime="2025-10-03 12:07:39.551568126 +0000 UTC m=+8641.348200023" watchObservedRunningTime="2025-10-03 12:07:39.563308979 +0000 UTC m=+8641.359940836" Oct 03 12:07:44 crc kubenswrapper[4990]: I1003 12:07:44.895373 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:44 crc kubenswrapper[4990]: I1003 12:07:44.898798 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:44 crc kubenswrapper[4990]: I1003 12:07:44.973307 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:45 crc kubenswrapper[4990]: I1003 12:07:45.700450 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:45 crc kubenswrapper[4990]: I1003 12:07:45.755390 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:45 crc kubenswrapper[4990]: I1003 12:07:45.871862 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:07:45 crc kubenswrapper[4990]: E1003 12:07:45.872440 4990 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:07:47 crc kubenswrapper[4990]: I1003 12:07:47.627699 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2842" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="registry-server" containerID="cri-o://6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f" gracePeriod=2 Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.177486 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.250707 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities\") pod \"21db1c1e-7909-47e6-9d5f-bff37456651e\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.251115 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content\") pod \"21db1c1e-7909-47e6-9d5f-bff37456651e\" (UID: \"21db1c1e-7909-47e6-9d5f-bff37456651e\") " Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.251198 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9898\" (UniqueName: \"kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898\") pod \"21db1c1e-7909-47e6-9d5f-bff37456651e\" (UID: 
\"21db1c1e-7909-47e6-9d5f-bff37456651e\") " Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.251886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities" (OuterVolumeSpecName: "utilities") pod "21db1c1e-7909-47e6-9d5f-bff37456651e" (UID: "21db1c1e-7909-47e6-9d5f-bff37456651e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.260294 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898" (OuterVolumeSpecName: "kube-api-access-j9898") pod "21db1c1e-7909-47e6-9d5f-bff37456651e" (UID: "21db1c1e-7909-47e6-9d5f-bff37456651e"). InnerVolumeSpecName "kube-api-access-j9898". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.306774 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21db1c1e-7909-47e6-9d5f-bff37456651e" (UID: "21db1c1e-7909-47e6-9d5f-bff37456651e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.353873 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.354214 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9898\" (UniqueName: \"kubernetes.io/projected/21db1c1e-7909-47e6-9d5f-bff37456651e-kube-api-access-j9898\") on node \"crc\" DevicePath \"\"" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.354314 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21db1c1e-7909-47e6-9d5f-bff37456651e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.639728 4990 generic.go:334] "Generic (PLEG): container finished" podID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerID="6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f" exitCode=0 Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.639788 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b2842" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.639792 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerDied","Data":"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f"} Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.639943 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2842" event={"ID":"21db1c1e-7909-47e6-9d5f-bff37456651e","Type":"ContainerDied","Data":"28e296743c2eac8a6659aff5d9e0cdc385a94e507a46b62752fd6703bc1ac5f8"} Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.639969 4990 scope.go:117] "RemoveContainer" containerID="6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.671265 4990 scope.go:117] "RemoveContainer" containerID="7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.678018 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.692579 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2842"] Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.704772 4990 scope.go:117] "RemoveContainer" containerID="e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.758809 4990 scope.go:117] "RemoveContainer" containerID="6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f" Oct 03 12:07:48 crc kubenswrapper[4990]: E1003 12:07:48.759283 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f\": container with ID starting with 6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f not found: ID does not exist" containerID="6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.759338 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f"} err="failed to get container status \"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f\": rpc error: code = NotFound desc = could not find container \"6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f\": container with ID starting with 6769bfa052f7c1146f5f74e28e20e992710428d5f2af76d639764e8266b3542f not found: ID does not exist" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.759383 4990 scope.go:117] "RemoveContainer" containerID="7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69" Oct 03 12:07:48 crc kubenswrapper[4990]: E1003 12:07:48.759830 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69\": container with ID starting with 7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69 not found: ID does not exist" containerID="7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.759863 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69"} err="failed to get container status \"7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69\": rpc error: code = NotFound desc = could not find container \"7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69\": container with ID 
starting with 7aa4f823ba8816ba86da32a34d856e58324b81a972e703704bd6766d4381be69 not found: ID does not exist" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.759883 4990 scope.go:117] "RemoveContainer" containerID="e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a" Oct 03 12:07:48 crc kubenswrapper[4990]: E1003 12:07:48.760119 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a\": container with ID starting with e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a not found: ID does not exist" containerID="e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.760139 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a"} err="failed to get container status \"e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a\": rpc error: code = NotFound desc = could not find container \"e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a\": container with ID starting with e1e6d3b2df61ac98da047676e14ce3f196957a1ffbfa5fcdb2a69028de5af92a not found: ID does not exist" Oct 03 12:07:48 crc kubenswrapper[4990]: I1003 12:07:48.890014 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" path="/var/lib/kubelet/pods/21db1c1e-7909-47e6-9d5f-bff37456651e/volumes" Oct 03 12:07:59 crc kubenswrapper[4990]: I1003 12:07:59.871542 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:07:59 crc kubenswrapper[4990]: E1003 12:07:59.872358 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:08:14 crc kubenswrapper[4990]: I1003 12:08:14.872761 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:08:14 crc kubenswrapper[4990]: E1003 12:08:14.874036 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:08:28 crc kubenswrapper[4990]: I1003 12:08:28.884078 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:08:28 crc kubenswrapper[4990]: E1003 12:08:28.885460 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:08:43 crc kubenswrapper[4990]: I1003 12:08:43.872655 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:08:43 crc kubenswrapper[4990]: E1003 12:08:43.873544 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.410899 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:08:52 crc kubenswrapper[4990]: E1003 12:08:52.413038 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="extract-content" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.413151 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="extract-content" Oct 03 12:08:52 crc kubenswrapper[4990]: E1003 12:08:52.413239 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="extract-utilities" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.413314 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="extract-utilities" Oct 03 12:08:52 crc kubenswrapper[4990]: E1003 12:08:52.413407 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="registry-server" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.413485 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="registry-server" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.413863 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="21db1c1e-7909-47e6-9d5f-bff37456651e" containerName="registry-server" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.415895 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.439742 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.515433 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhwh\" (UniqueName: \"kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.515817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.515997 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.618199 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.618423 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xkhwh\" (UniqueName: \"kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.618560 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.619035 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.619150 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.641286 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhwh\" (UniqueName: \"kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh\") pod \"certified-operators-248xn\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:52 crc kubenswrapper[4990]: I1003 12:08:52.753833 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:08:53 crc kubenswrapper[4990]: I1003 12:08:53.298961 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:08:53 crc kubenswrapper[4990]: I1003 12:08:53.467889 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerStarted","Data":"e921fc155a7818606fb9f4f9c77a83d0a5731e6505a557ed37cb26e5ab9467e1"} Oct 03 12:08:54 crc kubenswrapper[4990]: I1003 12:08:54.482658 4990 generic.go:334] "Generic (PLEG): container finished" podID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerID="92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc" exitCode=0 Oct 03 12:08:54 crc kubenswrapper[4990]: I1003 12:08:54.482783 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerDied","Data":"92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc"} Oct 03 12:08:55 crc kubenswrapper[4990]: I1003 12:08:55.531807 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerStarted","Data":"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7"} Oct 03 12:08:55 crc kubenswrapper[4990]: I1003 12:08:55.872306 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:08:55 crc kubenswrapper[4990]: E1003 12:08:55.872742 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:08:56 crc kubenswrapper[4990]: I1003 12:08:56.543589 4990 generic.go:334] "Generic (PLEG): container finished" podID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerID="5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7" exitCode=0 Oct 03 12:08:56 crc kubenswrapper[4990]: I1003 12:08:56.543728 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerDied","Data":"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7"} Oct 03 12:08:58 crc kubenswrapper[4990]: I1003 12:08:58.571973 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerStarted","Data":"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8"} Oct 03 12:08:58 crc kubenswrapper[4990]: I1003 12:08:58.597087 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-248xn" podStartSLOduration=3.846743402 podStartE2EDuration="6.597068305s" podCreationTimestamp="2025-10-03 12:08:52 +0000 UTC" firstStartedPulling="2025-10-03 12:08:54.485239828 +0000 UTC m=+8716.281871715" lastFinishedPulling="2025-10-03 12:08:57.235564761 +0000 UTC m=+8719.032196618" observedRunningTime="2025-10-03 12:08:58.588343749 +0000 UTC m=+8720.384975616" watchObservedRunningTime="2025-10-03 12:08:58.597068305 +0000 UTC m=+8720.393700162" Oct 03 12:09:02 crc kubenswrapper[4990]: I1003 12:09:02.754035 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:02 crc kubenswrapper[4990]: I1003 
12:09:02.754491 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:02 crc kubenswrapper[4990]: I1003 12:09:02.829615 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:03 crc kubenswrapper[4990]: I1003 12:09:03.682379 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:05 crc kubenswrapper[4990]: I1003 12:09:05.799810 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:09:05 crc kubenswrapper[4990]: I1003 12:09:05.800484 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-248xn" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="registry-server" containerID="cri-o://6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8" gracePeriod=2 Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.404582 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.591027 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content\") pod \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.591140 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhwh\" (UniqueName: \"kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh\") pod \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.591270 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities\") pod \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\" (UID: \"85f7d891-a3b2-4441-9f91-5f3fbf95912a\") " Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.592679 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities" (OuterVolumeSpecName: "utilities") pod "85f7d891-a3b2-4441-9f91-5f3fbf95912a" (UID: "85f7d891-a3b2-4441-9f91-5f3fbf95912a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.610858 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh" (OuterVolumeSpecName: "kube-api-access-xkhwh") pod "85f7d891-a3b2-4441-9f91-5f3fbf95912a" (UID: "85f7d891-a3b2-4441-9f91-5f3fbf95912a"). InnerVolumeSpecName "kube-api-access-xkhwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.654707 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85f7d891-a3b2-4441-9f91-5f3fbf95912a" (UID: "85f7d891-a3b2-4441-9f91-5f3fbf95912a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.672123 4990 generic.go:334] "Generic (PLEG): container finished" podID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerID="6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8" exitCode=0 Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.672168 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerDied","Data":"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8"} Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.672201 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-248xn" event={"ID":"85f7d891-a3b2-4441-9f91-5f3fbf95912a","Type":"ContainerDied","Data":"e921fc155a7818606fb9f4f9c77a83d0a5731e6505a557ed37cb26e5ab9467e1"} Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.672216 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-248xn" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.672226 4990 scope.go:117] "RemoveContainer" containerID="6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.698284 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.698342 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhwh\" (UniqueName: \"kubernetes.io/projected/85f7d891-a3b2-4441-9f91-5f3fbf95912a-kube-api-access-xkhwh\") on node \"crc\" DevicePath \"\"" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.698358 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85f7d891-a3b2-4441-9f91-5f3fbf95912a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.711231 4990 scope.go:117] "RemoveContainer" containerID="5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.711420 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.723665 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-248xn"] Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.743121 4990 scope.go:117] "RemoveContainer" containerID="92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.797535 4990 scope.go:117] "RemoveContainer" containerID="6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8" Oct 03 12:09:06 crc kubenswrapper[4990]: E1003 
12:09:06.798284 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8\": container with ID starting with 6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8 not found: ID does not exist" containerID="6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.798328 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8"} err="failed to get container status \"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8\": rpc error: code = NotFound desc = could not find container \"6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8\": container with ID starting with 6dc288233d0ad5061cf568e9ada769762d42ead432e7844a07b6a870b67e11a8 not found: ID does not exist" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.798355 4990 scope.go:117] "RemoveContainer" containerID="5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7" Oct 03 12:09:06 crc kubenswrapper[4990]: E1003 12:09:06.800302 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7\": container with ID starting with 5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7 not found: ID does not exist" containerID="5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.800373 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7"} err="failed to get container status \"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7\": rpc 
error: code = NotFound desc = could not find container \"5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7\": container with ID starting with 5a52e41fa7b431db6b0c80c7920c6c25c479a284bdc48c075c8b109f775e32c7 not found: ID does not exist" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.800422 4990 scope.go:117] "RemoveContainer" containerID="92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc" Oct 03 12:09:06 crc kubenswrapper[4990]: E1003 12:09:06.800890 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc\": container with ID starting with 92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc not found: ID does not exist" containerID="92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.800922 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc"} err="failed to get container status \"92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc\": rpc error: code = NotFound desc = could not find container \"92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc\": container with ID starting with 92d1810d618782c09fd5dac753b783d397fa4ace936102bb1cd10e39334f68bc not found: ID does not exist" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.872762 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:09:06 crc kubenswrapper[4990]: E1003 12:09:06.873152 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:09:06 crc kubenswrapper[4990]: I1003 12:09:06.889573 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" path="/var/lib/kubelet/pods/85f7d891-a3b2-4441-9f91-5f3fbf95912a/volumes" Oct 03 12:09:19 crc kubenswrapper[4990]: I1003 12:09:19.872192 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:09:19 crc kubenswrapper[4990]: E1003 12:09:19.873343 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:09:32 crc kubenswrapper[4990]: I1003 12:09:32.872395 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:09:32 crc kubenswrapper[4990]: E1003 12:09:32.873461 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:09:44 crc kubenswrapper[4990]: I1003 12:09:44.872274 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 
03 12:09:44 crc kubenswrapper[4990]: E1003 12:09:44.873175 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:09:55 crc kubenswrapper[4990]: I1003 12:09:55.872072 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:09:56 crc kubenswrapper[4990]: I1003 12:09:56.327193 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc"} Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.823537 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:43 crc kubenswrapper[4990]: E1003 12:10:43.825205 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="extract-utilities" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.825276 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="extract-utilities" Oct 03 12:10:43 crc kubenswrapper[4990]: E1003 12:10:43.825355 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="extract-content" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.825435 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="extract-content" Oct 03 12:10:43 crc 
kubenswrapper[4990]: E1003 12:10:43.825552 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="registry-server" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.825614 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="registry-server" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.825863 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f7d891-a3b2-4441-9f91-5f3fbf95912a" containerName="registry-server" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.827440 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.837521 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.873191 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqxgd\" (UniqueName: \"kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.873238 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.873271 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.975566 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqxgd\" (UniqueName: \"kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.975641 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.975691 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.976253 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.976364 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:43 crc kubenswrapper[4990]: I1003 12:10:43.997674 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqxgd\" (UniqueName: \"kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd\") pod \"redhat-marketplace-j7dz2\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:44 crc kubenswrapper[4990]: I1003 12:10:44.167646 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:44 crc kubenswrapper[4990]: I1003 12:10:44.670600 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:44 crc kubenswrapper[4990]: I1003 12:10:44.926272 4990 generic.go:334] "Generic (PLEG): container finished" podID="51db8203-0744-43df-b04a-997ca66731b3" containerID="04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492" exitCode=0 Oct 03 12:10:44 crc kubenswrapper[4990]: I1003 12:10:44.926450 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerDied","Data":"04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492"} Oct 03 12:10:44 crc kubenswrapper[4990]: I1003 12:10:44.926787 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerStarted","Data":"9453ba86382bbbf68905185572992af6ee2baa184a03b2abea7a19d3f24a4714"} Oct 03 12:10:46 crc kubenswrapper[4990]: I1003 12:10:46.949058 4990 generic.go:334] "Generic (PLEG): container 
finished" podID="51db8203-0744-43df-b04a-997ca66731b3" containerID="969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a" exitCode=0 Oct 03 12:10:46 crc kubenswrapper[4990]: I1003 12:10:46.949118 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerDied","Data":"969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a"} Oct 03 12:10:47 crc kubenswrapper[4990]: I1003 12:10:47.968568 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerStarted","Data":"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab"} Oct 03 12:10:47 crc kubenswrapper[4990]: I1003 12:10:47.999361 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j7dz2" podStartSLOduration=2.534322447 podStartE2EDuration="4.999333971s" podCreationTimestamp="2025-10-03 12:10:43 +0000 UTC" firstStartedPulling="2025-10-03 12:10:44.929605608 +0000 UTC m=+8826.726237475" lastFinishedPulling="2025-10-03 12:10:47.394617142 +0000 UTC m=+8829.191248999" observedRunningTime="2025-10-03 12:10:47.991304083 +0000 UTC m=+8829.787935960" watchObservedRunningTime="2025-10-03 12:10:47.999333971 +0000 UTC m=+8829.795965838" Oct 03 12:10:53 crc kubenswrapper[4990]: I1003 12:10:53.054366 4990 generic.go:334] "Generic (PLEG): container finished" podID="1d028b25-53ad-418d-a47b-3bee28cdd317" containerID="76ca963628001db0b10543c0962ca0a93cc5be6469f4d0ec8f31fc266749c3fd" exitCode=0 Oct 03 12:10:53 crc kubenswrapper[4990]: I1003 12:10:53.055095 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" 
event={"ID":"1d028b25-53ad-418d-a47b-3bee28cdd317","Type":"ContainerDied","Data":"76ca963628001db0b10543c0962ca0a93cc5be6469f4d0ec8f31fc266749c3fd"} Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.168158 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.168223 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.224683 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.551645 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.665923 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnc8s\" (UniqueName: \"kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666105 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666157 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: 
\"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666184 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666274 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666331 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.666358 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key\") pod \"1d028b25-53ad-418d-a47b-3bee28cdd317\" (UID: \"1d028b25-53ad-418d-a47b-3bee28cdd317\") " Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.678385 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.678430 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s" (OuterVolumeSpecName: "kube-api-access-gnc8s") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "kube-api-access-gnc8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.701365 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.704822 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.734258 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory" (OuterVolumeSpecName: "inventory") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.761953 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.763244 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1d028b25-53ad-418d-a47b-3bee28cdd317" (UID: "1d028b25-53ad-418d-a47b-3bee28cdd317"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769649 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnc8s\" (UniqueName: \"kubernetes.io/projected/1d028b25-53ad-418d-a47b-3bee28cdd317-kube-api-access-gnc8s\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769674 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769705 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769715 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769724 4990 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769733 4990 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:54 crc kubenswrapper[4990]: I1003 12:10:54.769742 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1d028b25-53ad-418d-a47b-3bee28cdd317-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.084717 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" event={"ID":"1d028b25-53ad-418d-a47b-3bee28cdd317","Type":"ContainerDied","Data":"3dd67e8eea3941925b9aa49646976f7284c31be2e4d0016798d610e0e7ec5cf5"} Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.084786 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd67e8eea3941925b9aa49646976f7284c31be2e4d0016798d610e0e7ec5cf5" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.084786 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-ssglj" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.169341 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.244325 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-rj68z"] Oct 03 12:10:55 crc kubenswrapper[4990]: E1003 12:10:55.245024 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d028b25-53ad-418d-a47b-3bee28cdd317" containerName="telemetry-openstack-openstack-cell1" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.247687 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d028b25-53ad-418d-a47b-3bee28cdd317" containerName="telemetry-openstack-openstack-cell1" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.248315 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d028b25-53ad-418d-a47b-3bee28cdd317" containerName="telemetry-openstack-openstack-cell1" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.249537 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-rj68z"] Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.250862 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.255109 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.255226 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.255374 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.255446 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.256120 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.265055 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.381771 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrmj\" (UniqueName: \"kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.381829 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.382571 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.382671 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.382812 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.485465 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrmj\" (UniqueName: \"kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.485537 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.485564 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.485653 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.485685 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.490674 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: 
\"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.491162 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.491789 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.493132 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.505240 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrmj\" (UniqueName: \"kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj\") pod \"neutron-sriov-openstack-openstack-cell1-rj68z\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:55 crc kubenswrapper[4990]: I1003 12:10:55.575253 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:10:56 crc kubenswrapper[4990]: I1003 12:10:56.990108 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-rj68z"] Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.002656 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.111462 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" event={"ID":"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c","Type":"ContainerStarted","Data":"e9646fbed1777d54f3b007a6ffa7e389c8dc459f2649f7f6ea7ed5579977fd94"} Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.111731 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j7dz2" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="registry-server" containerID="cri-o://47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab" gracePeriod=2 Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.705977 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.840424 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content\") pod \"51db8203-0744-43df-b04a-997ca66731b3\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.845830 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqxgd\" (UniqueName: \"kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd\") pod \"51db8203-0744-43df-b04a-997ca66731b3\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.845908 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities\") pod \"51db8203-0744-43df-b04a-997ca66731b3\" (UID: \"51db8203-0744-43df-b04a-997ca66731b3\") " Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.846795 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities" (OuterVolumeSpecName: "utilities") pod "51db8203-0744-43df-b04a-997ca66731b3" (UID: "51db8203-0744-43df-b04a-997ca66731b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.855678 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51db8203-0744-43df-b04a-997ca66731b3" (UID: "51db8203-0744-43df-b04a-997ca66731b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.947664 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:57 crc kubenswrapper[4990]: I1003 12:10:57.947701 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51db8203-0744-43df-b04a-997ca66731b3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.129099 4990 generic.go:334] "Generic (PLEG): container finished" podID="51db8203-0744-43df-b04a-997ca66731b3" containerID="47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab" exitCode=0 Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.129146 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerDied","Data":"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab"} Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.129175 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j7dz2" event={"ID":"51db8203-0744-43df-b04a-997ca66731b3","Type":"ContainerDied","Data":"9453ba86382bbbf68905185572992af6ee2baa184a03b2abea7a19d3f24a4714"} Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.129197 4990 scope.go:117] "RemoveContainer" containerID="47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.129204 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j7dz2" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.173142 4990 scope.go:117] "RemoveContainer" containerID="969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.379267 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd" (OuterVolumeSpecName: "kube-api-access-qqxgd") pod "51db8203-0744-43df-b04a-997ca66731b3" (UID: "51db8203-0744-43df-b04a-997ca66731b3"). InnerVolumeSpecName "kube-api-access-qqxgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.405756 4990 scope.go:117] "RemoveContainer" containerID="04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.458405 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqxgd\" (UniqueName: \"kubernetes.io/projected/51db8203-0744-43df-b04a-997ca66731b3-kube-api-access-qqxgd\") on node \"crc\" DevicePath \"\"" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.546379 4990 scope.go:117] "RemoveContainer" containerID="47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab" Oct 03 12:10:58 crc kubenswrapper[4990]: E1003 12:10:58.546848 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab\": container with ID starting with 47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab not found: ID does not exist" containerID="47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.546887 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab"} err="failed to get container status \"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab\": rpc error: code = NotFound desc = could not find container \"47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab\": container with ID starting with 47513f3a57ca9b81b0a3890f3574650efd6ff5ed64873cf0a13bd768f025fbab not found: ID does not exist" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.546915 4990 scope.go:117] "RemoveContainer" containerID="969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a" Oct 03 12:10:58 crc kubenswrapper[4990]: E1003 12:10:58.547318 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a\": container with ID starting with 969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a not found: ID does not exist" containerID="969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.547374 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a"} err="failed to get container status \"969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a\": rpc error: code = NotFound desc = could not find container \"969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a\": container with ID starting with 969edb471b11e1f5a4dddf57f35a171aeb61af5a17fd628e780a7a09cc47962a not found: ID does not exist" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.547412 4990 scope.go:117] "RemoveContainer" containerID="04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492" Oct 03 12:10:58 crc kubenswrapper[4990]: E1003 12:10:58.548815 4990 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492\": container with ID starting with 04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492 not found: ID does not exist" containerID="04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.548843 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492"} err="failed to get container status \"04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492\": rpc error: code = NotFound desc = could not find container \"04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492\": container with ID starting with 04ea7b9fc87e60da0da0b1a46660f68e1744de72d3e3cb52eafe66ee1be29492 not found: ID does not exist" Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.613952 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.622404 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j7dz2"] Oct 03 12:10:58 crc kubenswrapper[4990]: I1003 12:10:58.901284 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51db8203-0744-43df-b04a-997ca66731b3" path="/var/lib/kubelet/pods/51db8203-0744-43df-b04a-997ca66731b3/volumes" Oct 03 12:10:59 crc kubenswrapper[4990]: I1003 12:10:59.142398 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" event={"ID":"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c","Type":"ContainerStarted","Data":"1cc82ccb8a838c2d55e54ed6662b52580da454530e5ad9761e5a48ea86189252"} Oct 03 12:10:59 crc kubenswrapper[4990]: I1003 12:10:59.174856 4990 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" podStartSLOduration=3.622484698 podStartE2EDuration="4.174833024s" podCreationTimestamp="2025-10-03 12:10:55 +0000 UTC" firstStartedPulling="2025-10-03 12:10:57.002310533 +0000 UTC m=+8838.798942400" lastFinishedPulling="2025-10-03 12:10:57.554658869 +0000 UTC m=+8839.351290726" observedRunningTime="2025-10-03 12:10:59.16847839 +0000 UTC m=+8840.965110257" watchObservedRunningTime="2025-10-03 12:10:59.174833024 +0000 UTC m=+8840.971464891" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.847316 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:00 crc kubenswrapper[4990]: E1003 12:11:00.848114 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="extract-content" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.848130 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="extract-content" Oct 03 12:11:00 crc kubenswrapper[4990]: E1003 12:11:00.848158 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="registry-server" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.848167 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="registry-server" Oct 03 12:11:00 crc kubenswrapper[4990]: E1003 12:11:00.848193 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="extract-utilities" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.848201 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="extract-utilities" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.848460 4990 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="51db8203-0744-43df-b04a-997ca66731b3" containerName="registry-server" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.850432 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.860616 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.945204 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwzp\" (UniqueName: \"kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.945472 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:00 crc kubenswrapper[4990]: I1003 12:11:00.946222 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.048244 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities\") pod \"redhat-operators-98xk8\" (UID: 
\"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.048307 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmwzp\" (UniqueName: \"kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.048363 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.048795 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.048852 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.070480 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwzp\" (UniqueName: \"kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp\") pod \"redhat-operators-98xk8\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " 
pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.189488 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:01 crc kubenswrapper[4990]: I1003 12:11:01.847734 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:01 crc kubenswrapper[4990]: W1003 12:11:01.853117 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebe19c4c_16c9_4380_9f7e_1f5aeaaaa9da.slice/crio-215854ab2c987e30ce06bcd9c219a99393c3d3838e8e3bcb4f8e66ffe665d3ab WatchSource:0}: Error finding container 215854ab2c987e30ce06bcd9c219a99393c3d3838e8e3bcb4f8e66ffe665d3ab: Status 404 returned error can't find the container with id 215854ab2c987e30ce06bcd9c219a99393c3d3838e8e3bcb4f8e66ffe665d3ab Oct 03 12:11:02 crc kubenswrapper[4990]: I1003 12:11:02.172891 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerID="b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd" exitCode=0 Oct 03 12:11:02 crc kubenswrapper[4990]: I1003 12:11:02.172934 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerDied","Data":"b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd"} Oct 03 12:11:02 crc kubenswrapper[4990]: I1003 12:11:02.172957 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerStarted","Data":"215854ab2c987e30ce06bcd9c219a99393c3d3838e8e3bcb4f8e66ffe665d3ab"} Oct 03 12:11:04 crc kubenswrapper[4990]: I1003 12:11:04.194881 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerID="1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db" exitCode=0 Oct 03 12:11:04 crc kubenswrapper[4990]: I1003 12:11:04.194928 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerDied","Data":"1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db"} Oct 03 12:11:05 crc kubenswrapper[4990]: I1003 12:11:05.208154 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerStarted","Data":"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284"} Oct 03 12:11:06 crc kubenswrapper[4990]: I1003 12:11:06.257976 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98xk8" podStartSLOduration=3.6629992590000002 podStartE2EDuration="6.257950109s" podCreationTimestamp="2025-10-03 12:11:00 +0000 UTC" firstStartedPulling="2025-10-03 12:11:02.174446664 +0000 UTC m=+8843.971078521" lastFinishedPulling="2025-10-03 12:11:04.769397484 +0000 UTC m=+8846.566029371" observedRunningTime="2025-10-03 12:11:06.242039199 +0000 UTC m=+8848.038671076" watchObservedRunningTime="2025-10-03 12:11:06.257950109 +0000 UTC m=+8848.054582006" Oct 03 12:11:11 crc kubenswrapper[4990]: I1003 12:11:11.189964 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:11 crc kubenswrapper[4990]: I1003 12:11:11.190885 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:11 crc kubenswrapper[4990]: I1003 12:11:11.284913 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:12 crc 
kubenswrapper[4990]: I1003 12:11:12.372628 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:12 crc kubenswrapper[4990]: I1003 12:11:12.440377 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:14 crc kubenswrapper[4990]: I1003 12:11:14.313451 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-98xk8" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="registry-server" containerID="cri-o://8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284" gracePeriod=2 Oct 03 12:11:14 crc kubenswrapper[4990]: I1003 12:11:14.906744 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.000239 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmwzp\" (UniqueName: \"kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp\") pod \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.000374 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities\") pod \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.000452 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content\") pod \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\" (UID: \"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da\") " Oct 03 12:11:15 crc 
kubenswrapper[4990]: I1003 12:11:15.005238 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities" (OuterVolumeSpecName: "utilities") pod "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" (UID: "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.013806 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp" (OuterVolumeSpecName: "kube-api-access-pmwzp") pod "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" (UID: "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da"). InnerVolumeSpecName "kube-api-access-pmwzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.101885 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" (UID: "ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.104904 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmwzp\" (UniqueName: \"kubernetes.io/projected/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-kube-api-access-pmwzp\") on node \"crc\" DevicePath \"\"" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.104972 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.104994 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.333321 4990 generic.go:334] "Generic (PLEG): container finished" podID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerID="8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284" exitCode=0 Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.333453 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98xk8" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.333479 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerDied","Data":"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284"} Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.334998 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98xk8" event={"ID":"ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da","Type":"ContainerDied","Data":"215854ab2c987e30ce06bcd9c219a99393c3d3838e8e3bcb4f8e66ffe665d3ab"} Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.335053 4990 scope.go:117] "RemoveContainer" containerID="8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.378711 4990 scope.go:117] "RemoveContainer" containerID="1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.425799 4990 scope.go:117] "RemoveContainer" containerID="b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.439495 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.451363 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98xk8"] Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.456704 4990 scope.go:117] "RemoveContainer" containerID="8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284" Oct 03 12:11:15 crc kubenswrapper[4990]: E1003 12:11:15.457493 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284\": container with ID starting with 8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284 not found: ID does not exist" containerID="8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.457596 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284"} err="failed to get container status \"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284\": rpc error: code = NotFound desc = could not find container \"8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284\": container with ID starting with 8110e47e40acef271485210c86f03e5b735e93c08946bc752293426f0f574284 not found: ID does not exist" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.457648 4990 scope.go:117] "RemoveContainer" containerID="1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db" Oct 03 12:11:15 crc kubenswrapper[4990]: E1003 12:11:15.459171 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db\": container with ID starting with 1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db not found: ID does not exist" containerID="1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.459207 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db"} err="failed to get container status \"1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db\": rpc error: code = NotFound desc = could not find container \"1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db\": container with ID 
starting with 1036653a7212b8539493333fdabdb66139ce910136df0e428588b8cc9aeb25db not found: ID does not exist" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.459228 4990 scope.go:117] "RemoveContainer" containerID="b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd" Oct 03 12:11:15 crc kubenswrapper[4990]: E1003 12:11:15.459627 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd\": container with ID starting with b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd not found: ID does not exist" containerID="b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd" Oct 03 12:11:15 crc kubenswrapper[4990]: I1003 12:11:15.459679 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd"} err="failed to get container status \"b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd\": rpc error: code = NotFound desc = could not find container \"b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd\": container with ID starting with b70bf63d314e8137e7ca219a0faccc263ec503789a7cec2606c8c048f0203afd not found: ID does not exist" Oct 03 12:11:16 crc kubenswrapper[4990]: I1003 12:11:16.890971 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" path="/var/lib/kubelet/pods/ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da/volumes" Oct 03 12:12:25 crc kubenswrapper[4990]: I1003 12:12:25.304187 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:12:25 crc kubenswrapper[4990]: I1003 
12:12:25.304992 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:12:55 crc kubenswrapper[4990]: I1003 12:12:55.303967 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:12:55 crc kubenswrapper[4990]: I1003 12:12:55.304723 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:13:04 crc kubenswrapper[4990]: I1003 12:13:04.731729 4990 generic.go:334] "Generic (PLEG): container finished" podID="e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" containerID="1cc82ccb8a838c2d55e54ed6662b52580da454530e5ad9761e5a48ea86189252" exitCode=0 Oct 03 12:13:04 crc kubenswrapper[4990]: I1003 12:13:04.731822 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" event={"ID":"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c","Type":"ContainerDied","Data":"1cc82ccb8a838c2d55e54ed6662b52580da454530e5ad9761e5a48ea86189252"} Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.239429 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.317689 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0\") pod \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.317899 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory\") pod \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.317945 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key\") pod \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.318051 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle\") pod \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.318100 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrmj\" (UniqueName: \"kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj\") pod \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\" (UID: \"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c\") " Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.326193 4990 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj" (OuterVolumeSpecName: "kube-api-access-glrmj") pod "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" (UID: "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c"). InnerVolumeSpecName "kube-api-access-glrmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.329178 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" (UID: "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.347145 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory" (OuterVolumeSpecName: "inventory") pod "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" (UID: "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.349191 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" (UID: "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.352886 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" (UID: "e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.420760 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.420802 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.420818 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.420830 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.420846 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrmj\" (UniqueName: \"kubernetes.io/projected/e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c-kube-api-access-glrmj\") on node \"crc\" DevicePath \"\"" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.757158 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" event={"ID":"e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c","Type":"ContainerDied","Data":"e9646fbed1777d54f3b007a6ffa7e389c8dc459f2649f7f6ea7ed5579977fd94"} Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.757211 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9646fbed1777d54f3b007a6ffa7e389c8dc459f2649f7f6ea7ed5579977fd94" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.757222 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-rj68z" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.860009 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d77pm"] Oct 03 12:13:06 crc kubenswrapper[4990]: E1003 12:13:06.860891 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="extract-utilities" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.860917 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="extract-utilities" Oct 03 12:13:06 crc kubenswrapper[4990]: E1003 12:13:06.860954 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="registry-server" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.860963 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="registry-server" Oct 03 12:13:06 crc kubenswrapper[4990]: E1003 12:13:06.860979 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" containerName="neutron-sriov-openstack-openstack-cell1" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.860988 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" 
containerName="neutron-sriov-openstack-openstack-cell1" Oct 03 12:13:06 crc kubenswrapper[4990]: E1003 12:13:06.861020 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="extract-content" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.861027 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="extract-content" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.861278 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe19c4c-16c9-4380-9f7e-1f5aeaaaa9da" containerName="registry-server" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.861303 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c" containerName="neutron-sriov-openstack-openstack-cell1" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.862275 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.864384 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.864565 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.865799 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.867204 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.872066 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 12:13:06 crc kubenswrapper[4990]: 
I1003 12:13:06.889008 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d77pm"] Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.933962 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.934039 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.934078 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 12:13:06.934102 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:06 crc kubenswrapper[4990]: I1003 
12:13:06.934129 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btmtk\" (UniqueName: \"kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.035529 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.035625 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.035661 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.035683 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory\") pod 
\"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.035715 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btmtk\" (UniqueName: \"kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.040197 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.040875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.041200 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.041575 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.055375 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btmtk\" (UniqueName: \"kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk\") pod \"neutron-dhcp-openstack-openstack-cell1-d77pm\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.191004 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:13:07 crc kubenswrapper[4990]: I1003 12:13:07.798084 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-d77pm"] Oct 03 12:13:07 crc kubenswrapper[4990]: W1003 12:13:07.801655 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11994fd9_0d27_4cb9_9871_40e9416961fd.slice/crio-31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e WatchSource:0}: Error finding container 31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e: Status 404 returned error can't find the container with id 31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e Oct 03 12:13:08 crc kubenswrapper[4990]: I1003 12:13:08.777765 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" event={"ID":"11994fd9-0d27-4cb9-9871-40e9416961fd","Type":"ContainerStarted","Data":"017c147b63986930061cc502d1c6c1058d129f65ce2f567b32f3776e9939fde5"} Oct 03 
12:13:08 crc kubenswrapper[4990]: I1003 12:13:08.778012 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" event={"ID":"11994fd9-0d27-4cb9-9871-40e9416961fd","Type":"ContainerStarted","Data":"31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e"} Oct 03 12:13:08 crc kubenswrapper[4990]: I1003 12:13:08.798815 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" podStartSLOduration=2.288301156 podStartE2EDuration="2.79879142s" podCreationTimestamp="2025-10-03 12:13:06 +0000 UTC" firstStartedPulling="2025-10-03 12:13:07.804609374 +0000 UTC m=+8969.601241261" lastFinishedPulling="2025-10-03 12:13:08.315099618 +0000 UTC m=+8970.111731525" observedRunningTime="2025-10-03 12:13:08.796218074 +0000 UTC m=+8970.592849941" watchObservedRunningTime="2025-10-03 12:13:08.79879142 +0000 UTC m=+8970.595423287" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.303544 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.304277 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.304331 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.305312 4990 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.305384 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc" gracePeriod=600 Oct 03 12:13:25 crc kubenswrapper[4990]: E1003 12:13:25.550210 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ea38c_26da_4987_a50d_bafecdfbbd02.slice/crio-conmon-45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ea38c_26da_4987_a50d_bafecdfbbd02.slice/crio-45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc.scope\": RecentStats: unable to find data in memory cache]" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.982661 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc" exitCode=0 Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.982833 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc"} Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.983447 4990 scope.go:117] "RemoveContainer" containerID="521d988e53a6a4154824496a99481f618463c43174d4235684a81eecba1b00c9" Oct 03 12:13:25 crc kubenswrapper[4990]: I1003 12:13:25.984773 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0"} Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.155579 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq"] Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.158027 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.164470 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq"] Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.206154 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.206242 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.249941 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7g7\" (UniqueName: \"kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7\") pod \"collect-profiles-29324895-n82jq\" 
(UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.249999 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.250108 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.352177 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7g7\" (UniqueName: \"kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.352285 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.353496 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.354575 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.364833 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.378383 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7g7\" (UniqueName: \"kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7\") pod \"collect-profiles-29324895-n82jq\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:00 crc kubenswrapper[4990]: I1003 12:15:00.532630 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:01 crc kubenswrapper[4990]: I1003 12:15:01.024850 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq"] Oct 03 12:15:01 crc kubenswrapper[4990]: W1003 12:15:01.035567 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b10aa5_2c5f_4f95_b1a4_179c67551993.slice/crio-1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713 WatchSource:0}: Error finding container 1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713: Status 404 returned error can't find the container with id 1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713 Oct 03 12:15:01 crc kubenswrapper[4990]: I1003 12:15:01.098175 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" event={"ID":"41b10aa5-2c5f-4f95-b1a4-179c67551993","Type":"ContainerStarted","Data":"1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713"} Oct 03 12:15:02 crc kubenswrapper[4990]: I1003 12:15:02.115646 4990 generic.go:334] "Generic (PLEG): container finished" podID="41b10aa5-2c5f-4f95-b1a4-179c67551993" containerID="f751e5354d9397b1704f021db9f33067e88996b39a6e45255f58725bdc53ef56" exitCode=0 Oct 03 12:15:02 crc kubenswrapper[4990]: I1003 12:15:02.115788 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" event={"ID":"41b10aa5-2c5f-4f95-b1a4-179c67551993","Type":"ContainerDied","Data":"f751e5354d9397b1704f021db9f33067e88996b39a6e45255f58725bdc53ef56"} Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.579151 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.629888 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume\") pod \"41b10aa5-2c5f-4f95-b1a4-179c67551993\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.630442 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume\") pod \"41b10aa5-2c5f-4f95-b1a4-179c67551993\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.630497 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt7g7\" (UniqueName: \"kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7\") pod \"41b10aa5-2c5f-4f95-b1a4-179c67551993\" (UID: \"41b10aa5-2c5f-4f95-b1a4-179c67551993\") " Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.631994 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume" (OuterVolumeSpecName: "config-volume") pod "41b10aa5-2c5f-4f95-b1a4-179c67551993" (UID: "41b10aa5-2c5f-4f95-b1a4-179c67551993"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.636563 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7" (OuterVolumeSpecName: "kube-api-access-gt7g7") pod "41b10aa5-2c5f-4f95-b1a4-179c67551993" (UID: "41b10aa5-2c5f-4f95-b1a4-179c67551993"). 
InnerVolumeSpecName "kube-api-access-gt7g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.637124 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41b10aa5-2c5f-4f95-b1a4-179c67551993" (UID: "41b10aa5-2c5f-4f95-b1a4-179c67551993"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.737590 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41b10aa5-2c5f-4f95-b1a4-179c67551993-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.737624 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt7g7\" (UniqueName: \"kubernetes.io/projected/41b10aa5-2c5f-4f95-b1a4-179c67551993-kube-api-access-gt7g7\") on node \"crc\" DevicePath \"\"" Oct 03 12:15:03 crc kubenswrapper[4990]: I1003 12:15:03.737636 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41b10aa5-2c5f-4f95-b1a4-179c67551993-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.139378 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" event={"ID":"41b10aa5-2c5f-4f95-b1a4-179c67551993","Type":"ContainerDied","Data":"1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713"} Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.139420 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff743bcbd808c13e4a0f723b1db57f75acfa53bba30d58ec5955a2b9b30d713" Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.139475 4990 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324895-n82jq" Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.671661 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"] Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.684598 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324850-sjvzz"] Oct 03 12:15:04 crc kubenswrapper[4990]: I1003 12:15:04.889800 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4070aa48-04e2-4498-a16d-7f20675e55c7" path="/var/lib/kubelet/pods/4070aa48-04e2-4498-a16d-7f20675e55c7/volumes" Oct 03 12:15:25 crc kubenswrapper[4990]: I1003 12:15:25.303865 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:15:25 crc kubenswrapper[4990]: I1003 12:15:25.304624 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:15:54 crc kubenswrapper[4990]: I1003 12:15:54.934091 4990 scope.go:117] "RemoveContainer" containerID="db2420606c169afcbcab781b7a7639f76f63b010145ab959711f2cd1a6c2b4e6" Oct 03 12:15:55 crc kubenswrapper[4990]: I1003 12:15:55.304409 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:15:55 crc kubenswrapper[4990]: I1003 12:15:55.304550 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:16:25 crc kubenswrapper[4990]: I1003 12:16:25.305604 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:16:25 crc kubenswrapper[4990]: I1003 12:16:25.307183 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:16:25 crc kubenswrapper[4990]: I1003 12:16:25.307286 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:16:25 crc kubenswrapper[4990]: I1003 12:16:25.307971 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:16:25 crc kubenswrapper[4990]: I1003 12:16:25.308094 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" gracePeriod=600 Oct 03 12:16:25 crc kubenswrapper[4990]: E1003 12:16:25.432748 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:16:26 crc kubenswrapper[4990]: I1003 12:16:26.158228 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" exitCode=0 Oct 03 12:16:26 crc kubenswrapper[4990]: I1003 12:16:26.158316 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0"} Oct 03 12:16:26 crc kubenswrapper[4990]: I1003 12:16:26.158945 4990 scope.go:117] "RemoveContainer" containerID="45a8888ec585ce51f377ce4c6e5d609c4d53f7840ec936fba419409c317886fc" Oct 03 12:16:26 crc kubenswrapper[4990]: I1003 12:16:26.159748 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:16:26 crc kubenswrapper[4990]: E1003 12:16:26.160135 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:16:36 crc kubenswrapper[4990]: I1003 12:16:36.871242 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:16:36 crc kubenswrapper[4990]: E1003 12:16:36.872023 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:16:47 crc kubenswrapper[4990]: I1003 12:16:47.872421 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:16:47 crc kubenswrapper[4990]: E1003 12:16:47.873444 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:17:00 crc kubenswrapper[4990]: I1003 12:17:00.871985 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:17:00 crc kubenswrapper[4990]: E1003 12:17:00.873059 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:17:15 crc kubenswrapper[4990]: I1003 12:17:15.871638 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:17:15 crc kubenswrapper[4990]: E1003 12:17:15.872597 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:17:28 crc kubenswrapper[4990]: I1003 12:17:28.881973 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:17:28 crc kubenswrapper[4990]: E1003 12:17:28.882825 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:17:39 crc kubenswrapper[4990]: I1003 12:17:39.872470 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:17:39 crc kubenswrapper[4990]: E1003 12:17:39.873735 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:17:51 crc kubenswrapper[4990]: I1003 12:17:51.872893 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:17:51 crc kubenswrapper[4990]: E1003 12:17:51.874208 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:18:06 crc kubenswrapper[4990]: I1003 12:18:06.872696 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:18:06 crc kubenswrapper[4990]: E1003 12:18:06.874989 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:18:19 crc kubenswrapper[4990]: I1003 12:18:19.872371 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:18:19 crc kubenswrapper[4990]: E1003 12:18:19.873696 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.419440 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:22 crc kubenswrapper[4990]: E1003 12:18:22.420303 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b10aa5-2c5f-4f95-b1a4-179c67551993" containerName="collect-profiles" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.420321 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b10aa5-2c5f-4f95-b1a4-179c67551993" containerName="collect-profiles" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.420587 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b10aa5-2c5f-4f95-b1a4-179c67551993" containerName="collect-profiles" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.422532 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.432978 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.479193 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.479256 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.479973 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8q8\" (UniqueName: \"kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.582460 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8q8\" (UniqueName: \"kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.582964 4990 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.583132 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.583368 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.583487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.605190 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8q8\" (UniqueName: \"kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8\") pod \"community-operators-bhj6n\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:22 crc kubenswrapper[4990]: I1003 12:18:22.749988 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:23 crc kubenswrapper[4990]: I1003 12:18:23.247232 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:23 crc kubenswrapper[4990]: I1003 12:18:23.682016 4990 generic.go:334] "Generic (PLEG): container finished" podID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerID="cf01df0590a8514d177f06b461c04598c9d96e3e0b67ac2f69d2e5d1c5dd6786" exitCode=0 Oct 03 12:18:23 crc kubenswrapper[4990]: I1003 12:18:23.682133 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerDied","Data":"cf01df0590a8514d177f06b461c04598c9d96e3e0b67ac2f69d2e5d1c5dd6786"} Oct 03 12:18:23 crc kubenswrapper[4990]: I1003 12:18:23.682350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerStarted","Data":"0a2085c2bbe1cb8dcdfddd2e3090ff855fa8845be7e794cd9ba8b97aea9eda84"} Oct 03 12:18:23 crc kubenswrapper[4990]: I1003 12:18:23.686876 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:18:25 crc kubenswrapper[4990]: I1003 12:18:25.708780 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerStarted","Data":"0e180a9c601330bbc1c511af56570d8020efc4517ff803b976e1422c73b58925"} Oct 03 12:18:27 crc kubenswrapper[4990]: I1003 12:18:27.731540 4990 generic.go:334] "Generic (PLEG): container finished" podID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerID="0e180a9c601330bbc1c511af56570d8020efc4517ff803b976e1422c73b58925" exitCode=0 Oct 03 12:18:27 crc kubenswrapper[4990]: I1003 12:18:27.731627 4990 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerDied","Data":"0e180a9c601330bbc1c511af56570d8020efc4517ff803b976e1422c73b58925"} Oct 03 12:18:28 crc kubenswrapper[4990]: I1003 12:18:28.761045 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerStarted","Data":"b40a4587e0237a7f7cab79115c53be2cf9a378ec6e113140bd0223d549474a76"} Oct 03 12:18:28 crc kubenswrapper[4990]: I1003 12:18:28.790410 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhj6n" podStartSLOduration=2.042641834 podStartE2EDuration="6.790390198s" podCreationTimestamp="2025-10-03 12:18:22 +0000 UTC" firstStartedPulling="2025-10-03 12:18:23.686415528 +0000 UTC m=+9285.483047395" lastFinishedPulling="2025-10-03 12:18:28.434163902 +0000 UTC m=+9290.230795759" observedRunningTime="2025-10-03 12:18:28.78544451 +0000 UTC m=+9290.582076407" watchObservedRunningTime="2025-10-03 12:18:28.790390198 +0000 UTC m=+9290.587022055" Oct 03 12:18:30 crc kubenswrapper[4990]: I1003 12:18:30.871985 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:18:30 crc kubenswrapper[4990]: E1003 12:18:30.872552 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:18:32 crc kubenswrapper[4990]: I1003 12:18:32.750460 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:32 crc kubenswrapper[4990]: I1003 12:18:32.750861 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:32 crc kubenswrapper[4990]: I1003 12:18:32.815299 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:33 crc kubenswrapper[4990]: I1003 12:18:33.856760 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:33 crc kubenswrapper[4990]: I1003 12:18:33.920341 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:35 crc kubenswrapper[4990]: I1003 12:18:35.831717 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bhj6n" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="registry-server" containerID="cri-o://b40a4587e0237a7f7cab79115c53be2cf9a378ec6e113140bd0223d549474a76" gracePeriod=2 Oct 03 12:18:36 crc kubenswrapper[4990]: I1003 12:18:36.846436 4990 generic.go:334] "Generic (PLEG): container finished" podID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerID="b40a4587e0237a7f7cab79115c53be2cf9a378ec6e113140bd0223d549474a76" exitCode=0 Oct 03 12:18:36 crc kubenswrapper[4990]: I1003 12:18:36.846573 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerDied","Data":"b40a4587e0237a7f7cab79115c53be2cf9a378ec6e113140bd0223d549474a76"} Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.028896 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.225617 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h8q8\" (UniqueName: \"kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8\") pod \"2455fdfb-959d-4002-9d1a-67f7065b0757\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.226005 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities\") pod \"2455fdfb-959d-4002-9d1a-67f7065b0757\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.226158 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content\") pod \"2455fdfb-959d-4002-9d1a-67f7065b0757\" (UID: \"2455fdfb-959d-4002-9d1a-67f7065b0757\") " Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.226994 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities" (OuterVolumeSpecName: "utilities") pod "2455fdfb-959d-4002-9d1a-67f7065b0757" (UID: "2455fdfb-959d-4002-9d1a-67f7065b0757"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.233822 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8" (OuterVolumeSpecName: "kube-api-access-7h8q8") pod "2455fdfb-959d-4002-9d1a-67f7065b0757" (UID: "2455fdfb-959d-4002-9d1a-67f7065b0757"). InnerVolumeSpecName "kube-api-access-7h8q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.330073 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.330103 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h8q8\" (UniqueName: \"kubernetes.io/projected/2455fdfb-959d-4002-9d1a-67f7065b0757-kube-api-access-7h8q8\") on node \"crc\" DevicePath \"\"" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.341911 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2455fdfb-959d-4002-9d1a-67f7065b0757" (UID: "2455fdfb-959d-4002-9d1a-67f7065b0757"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.431696 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2455fdfb-959d-4002-9d1a-67f7065b0757-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.862471 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhj6n" event={"ID":"2455fdfb-959d-4002-9d1a-67f7065b0757","Type":"ContainerDied","Data":"0a2085c2bbe1cb8dcdfddd2e3090ff855fa8845be7e794cd9ba8b97aea9eda84"} Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.862796 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhj6n" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.863693 4990 scope.go:117] "RemoveContainer" containerID="b40a4587e0237a7f7cab79115c53be2cf9a378ec6e113140bd0223d549474a76" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.911965 4990 scope.go:117] "RemoveContainer" containerID="0e180a9c601330bbc1c511af56570d8020efc4517ff803b976e1422c73b58925" Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.924154 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.937248 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bhj6n"] Oct 03 12:18:37 crc kubenswrapper[4990]: I1003 12:18:37.957750 4990 scope.go:117] "RemoveContainer" containerID="cf01df0590a8514d177f06b461c04598c9d96e3e0b67ac2f69d2e5d1c5dd6786" Oct 03 12:18:38 crc kubenswrapper[4990]: I1003 12:18:38.889052 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" path="/var/lib/kubelet/pods/2455fdfb-959d-4002-9d1a-67f7065b0757/volumes" Oct 03 12:18:45 crc kubenswrapper[4990]: I1003 12:18:45.872612 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:18:45 crc kubenswrapper[4990]: E1003 12:18:45.873498 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:18:57 crc kubenswrapper[4990]: I1003 12:18:57.872162 4990 scope.go:117] "RemoveContainer" 
containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:18:57 crc kubenswrapper[4990]: E1003 12:18:57.873262 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.037024 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn5hg"] Oct 03 12:19:07 crc kubenswrapper[4990]: E1003 12:19:07.037817 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="extract-content" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.037830 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="extract-content" Oct 03 12:19:07 crc kubenswrapper[4990]: E1003 12:19:07.037857 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="registry-server" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.037863 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="registry-server" Oct 03 12:19:07 crc kubenswrapper[4990]: E1003 12:19:07.037883 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="extract-utilities" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.037889 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="extract-utilities" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.038063 
4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="2455fdfb-959d-4002-9d1a-67f7065b0757" containerName="registry-server" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.039431 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.052495 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5hg"] Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.185889 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-catalog-content\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.186713 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-utilities\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.186966 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pv9\" (UniqueName: \"kubernetes.io/projected/3336d35c-b95d-4fbf-90e4-de018ff26f36-kube-api-access-x2pv9\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.289379 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-catalog-content\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.289544 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-utilities\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.289623 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pv9\" (UniqueName: \"kubernetes.io/projected/3336d35c-b95d-4fbf-90e4-de018ff26f36-kube-api-access-x2pv9\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.289906 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-catalog-content\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.290248 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3336d35c-b95d-4fbf-90e4-de018ff26f36-utilities\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.379549 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pv9\" (UniqueName: 
\"kubernetes.io/projected/3336d35c-b95d-4fbf-90e4-de018ff26f36-kube-api-access-x2pv9\") pod \"certified-operators-gn5hg\" (UID: \"3336d35c-b95d-4fbf-90e4-de018ff26f36\") " pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:07 crc kubenswrapper[4990]: I1003 12:19:07.673178 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:08 crc kubenswrapper[4990]: I1003 12:19:08.183617 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5hg"] Oct 03 12:19:08 crc kubenswrapper[4990]: I1003 12:19:08.253856 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5hg" event={"ID":"3336d35c-b95d-4fbf-90e4-de018ff26f36","Type":"ContainerStarted","Data":"c5f08dd42e478dcccc911034dce9849ffd988faa5fff709ba259a66338a75a3a"} Oct 03 12:19:08 crc kubenswrapper[4990]: I1003 12:19:08.889354 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:19:08 crc kubenswrapper[4990]: E1003 12:19:08.890025 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:19:09 crc kubenswrapper[4990]: I1003 12:19:09.266880 4990 generic.go:334] "Generic (PLEG): container finished" podID="3336d35c-b95d-4fbf-90e4-de018ff26f36" containerID="59069a311fe7bbc67c4f4f62ad43d516aa7f4951ae540f306923a6415fed3929" exitCode=0 Oct 03 12:19:09 crc kubenswrapper[4990]: I1003 12:19:09.266924 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-gn5hg" event={"ID":"3336d35c-b95d-4fbf-90e4-de018ff26f36","Type":"ContainerDied","Data":"59069a311fe7bbc67c4f4f62ad43d516aa7f4951ae540f306923a6415fed3929"} Oct 03 12:19:15 crc kubenswrapper[4990]: I1003 12:19:15.368907 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5hg" event={"ID":"3336d35c-b95d-4fbf-90e4-de018ff26f36","Type":"ContainerStarted","Data":"e2e344775b58bdacc6d6558185dba5d4da2c133fa4a02d6b07f56cb40e4b75a6"} Oct 03 12:19:16 crc kubenswrapper[4990]: I1003 12:19:16.379198 4990 generic.go:334] "Generic (PLEG): container finished" podID="3336d35c-b95d-4fbf-90e4-de018ff26f36" containerID="e2e344775b58bdacc6d6558185dba5d4da2c133fa4a02d6b07f56cb40e4b75a6" exitCode=0 Oct 03 12:19:16 crc kubenswrapper[4990]: I1003 12:19:16.379295 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5hg" event={"ID":"3336d35c-b95d-4fbf-90e4-de018ff26f36","Type":"ContainerDied","Data":"e2e344775b58bdacc6d6558185dba5d4da2c133fa4a02d6b07f56cb40e4b75a6"} Oct 03 12:19:19 crc kubenswrapper[4990]: I1003 12:19:19.416632 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5hg" event={"ID":"3336d35c-b95d-4fbf-90e4-de018ff26f36","Type":"ContainerStarted","Data":"850ca87e34a2aaa4b167318980e3ffdd0aed315397d12e819ef6352c50b73b01"} Oct 03 12:19:19 crc kubenswrapper[4990]: I1003 12:19:19.453681 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn5hg" podStartSLOduration=3.738304958 podStartE2EDuration="12.453659252s" podCreationTimestamp="2025-10-03 12:19:07 +0000 UTC" firstStartedPulling="2025-10-03 12:19:09.26924494 +0000 UTC m=+9331.065876787" lastFinishedPulling="2025-10-03 12:19:17.984599224 +0000 UTC m=+9339.781231081" observedRunningTime="2025-10-03 12:19:19.449205046 +0000 UTC m=+9341.245836903" 
watchObservedRunningTime="2025-10-03 12:19:19.453659252 +0000 UTC m=+9341.250291109" Oct 03 12:19:23 crc kubenswrapper[4990]: I1003 12:19:23.872804 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:19:23 crc kubenswrapper[4990]: E1003 12:19:23.873785 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:19:27 crc kubenswrapper[4990]: I1003 12:19:27.673835 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:27 crc kubenswrapper[4990]: I1003 12:19:27.674458 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:27 crc kubenswrapper[4990]: I1003 12:19:27.758677 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:28 crc kubenswrapper[4990]: I1003 12:19:28.580169 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn5hg" Oct 03 12:19:28 crc kubenswrapper[4990]: I1003 12:19:28.660590 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5hg"] Oct 03 12:19:28 crc kubenswrapper[4990]: I1003 12:19:28.718121 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 12:19:28 crc kubenswrapper[4990]: I1003 12:19:28.718582 4990 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-xmqjm" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="registry-server" containerID="cri-o://d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975" gracePeriod=2 Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.228928 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.419754 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4958\" (UniqueName: \"kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958\") pod \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.419839 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content\") pod \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.419877 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities\") pod \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\" (UID: \"6cb448d7-b091-44b0-8db2-fa222f3f5ea2\") " Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.420785 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities" (OuterVolumeSpecName: "utilities") pod "6cb448d7-b091-44b0-8db2-fa222f3f5ea2" (UID: "6cb448d7-b091-44b0-8db2-fa222f3f5ea2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.427167 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958" (OuterVolumeSpecName: "kube-api-access-l4958") pod "6cb448d7-b091-44b0-8db2-fa222f3f5ea2" (UID: "6cb448d7-b091-44b0-8db2-fa222f3f5ea2"). InnerVolumeSpecName "kube-api-access-l4958". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.460772 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cb448d7-b091-44b0-8db2-fa222f3f5ea2" (UID: "6cb448d7-b091-44b0-8db2-fa222f3f5ea2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.522034 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4958\" (UniqueName: \"kubernetes.io/projected/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-kube-api-access-l4958\") on node \"crc\" DevicePath \"\"" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.522066 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.522077 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cb448d7-b091-44b0-8db2-fa222f3f5ea2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.540013 4990 generic.go:334] "Generic (PLEG): container finished" podID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" 
containerID="d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975" exitCode=0 Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.540073 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmqjm" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.540155 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerDied","Data":"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975"} Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.540187 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmqjm" event={"ID":"6cb448d7-b091-44b0-8db2-fa222f3f5ea2","Type":"ContainerDied","Data":"a70f7afef70b2ce5bbf0fb0dd668f59dc1c331c8f87f993eb7740942bedfe7b9"} Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.540208 4990 scope.go:117] "RemoveContainer" containerID="d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.578948 4990 scope.go:117] "RemoveContainer" containerID="f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.594567 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.606353 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xmqjm"] Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.621270 4990 scope.go:117] "RemoveContainer" containerID="47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.657683 4990 scope.go:117] "RemoveContainer" containerID="d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975" Oct 03 
12:19:29 crc kubenswrapper[4990]: E1003 12:19:29.658238 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975\": container with ID starting with d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975 not found: ID does not exist" containerID="d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.658299 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975"} err="failed to get container status \"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975\": rpc error: code = NotFound desc = could not find container \"d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975\": container with ID starting with d501fd22b6fb2c7394312ff3e0811418fe8c7912214dce3ac35850be2fcf4975 not found: ID does not exist" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.658337 4990 scope.go:117] "RemoveContainer" containerID="f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41" Oct 03 12:19:29 crc kubenswrapper[4990]: E1003 12:19:29.660269 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41\": container with ID starting with f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41 not found: ID does not exist" containerID="f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.660296 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41"} err="failed to get container status 
\"f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41\": rpc error: code = NotFound desc = could not find container \"f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41\": container with ID starting with f1936942a59117634aa07f4e20292f577b49028cd6d48378d7ea327b69753f41 not found: ID does not exist" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.660319 4990 scope.go:117] "RemoveContainer" containerID="47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc" Oct 03 12:19:29 crc kubenswrapper[4990]: E1003 12:19:29.660624 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc\": container with ID starting with 47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc not found: ID does not exist" containerID="47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc" Oct 03 12:19:29 crc kubenswrapper[4990]: I1003 12:19:29.660651 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc"} err="failed to get container status \"47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc\": rpc error: code = NotFound desc = could not find container \"47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc\": container with ID starting with 47f9569ab2442586ccc385a5caa7cfef5737bb49e09daeb30d8ad6c33d54f9bc not found: ID does not exist" Oct 03 12:19:30 crc kubenswrapper[4990]: I1003 12:19:30.885497 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" path="/var/lib/kubelet/pods/6cb448d7-b091-44b0-8db2-fa222f3f5ea2/volumes" Oct 03 12:19:38 crc kubenswrapper[4990]: I1003 12:19:38.872155 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 
12:19:38 crc kubenswrapper[4990]: E1003 12:19:38.873645 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:19:49 crc kubenswrapper[4990]: I1003 12:19:49.872670 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:19:49 crc kubenswrapper[4990]: E1003 12:19:49.874647 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:20:03 crc kubenswrapper[4990]: I1003 12:20:03.872731 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:20:03 crc kubenswrapper[4990]: E1003 12:20:03.873895 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:20:17 crc kubenswrapper[4990]: I1003 12:20:17.873199 4990 scope.go:117] "RemoveContainer" 
containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:20:17 crc kubenswrapper[4990]: E1003 12:20:17.873997 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:20:31 crc kubenswrapper[4990]: I1003 12:20:31.873005 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:20:31 crc kubenswrapper[4990]: E1003 12:20:31.874301 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:20:45 crc kubenswrapper[4990]: I1003 12:20:45.873327 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:20:45 crc kubenswrapper[4990]: E1003 12:20:45.874284 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:20:57 crc kubenswrapper[4990]: I1003 12:20:57.873552 4990 scope.go:117] 
"RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:20:57 crc kubenswrapper[4990]: E1003 12:20:57.874852 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:21:11 crc kubenswrapper[4990]: I1003 12:21:11.873138 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:21:11 crc kubenswrapper[4990]: E1003 12:21:11.874648 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.255912 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:12 crc kubenswrapper[4990]: E1003 12:21:12.256792 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="extract-content" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.256813 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="extract-content" Oct 03 12:21:12 crc kubenswrapper[4990]: E1003 12:21:12.256877 4990 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="extract-utilities" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.256887 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="extract-utilities" Oct 03 12:21:12 crc kubenswrapper[4990]: E1003 12:21:12.256905 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="registry-server" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.256914 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="registry-server" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.257237 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb448d7-b091-44b0-8db2-fa222f3f5ea2" containerName="registry-server" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.259309 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.269627 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.299305 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.299413 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") 
" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.299525 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjtz\" (UniqueName: \"kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.400839 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.401008 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjtz\" (UniqueName: \"kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.401065 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.401336 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " 
pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.401418 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.419234 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjtz\" (UniqueName: \"kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz\") pod \"redhat-marketplace-f6whq\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:12 crc kubenswrapper[4990]: I1003 12:21:12.582016 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:13 crc kubenswrapper[4990]: I1003 12:21:13.055157 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:13 crc kubenswrapper[4990]: I1003 12:21:13.866785 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerID="36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e" exitCode=0 Oct 03 12:21:13 crc kubenswrapper[4990]: I1003 12:21:13.867187 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerDied","Data":"36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e"} Oct 03 12:21:13 crc kubenswrapper[4990]: I1003 12:21:13.867242 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" 
event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerStarted","Data":"fa01389212e0709727e284d4fce3bdddf1f0353188641e892b3126864fc473da"} Oct 03 12:21:15 crc kubenswrapper[4990]: I1003 12:21:15.902670 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerID="c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6" exitCode=0 Oct 03 12:21:15 crc kubenswrapper[4990]: I1003 12:21:15.902731 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerDied","Data":"c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6"} Oct 03 12:21:16 crc kubenswrapper[4990]: I1003 12:21:16.917480 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerStarted","Data":"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986"} Oct 03 12:21:16 crc kubenswrapper[4990]: I1003 12:21:16.948587 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6whq" podStartSLOduration=2.453298552 podStartE2EDuration="4.948564449s" podCreationTimestamp="2025-10-03 12:21:12 +0000 UTC" firstStartedPulling="2025-10-03 12:21:13.870559805 +0000 UTC m=+9455.667191702" lastFinishedPulling="2025-10-03 12:21:16.365825732 +0000 UTC m=+9458.162457599" observedRunningTime="2025-10-03 12:21:16.93777634 +0000 UTC m=+9458.734408217" watchObservedRunningTime="2025-10-03 12:21:16.948564449 +0000 UTC m=+9458.745196316" Oct 03 12:21:22 crc kubenswrapper[4990]: I1003 12:21:22.582441 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:22 crc kubenswrapper[4990]: I1003 12:21:22.583101 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:22 crc kubenswrapper[4990]: I1003 12:21:22.660130 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:23 crc kubenswrapper[4990]: I1003 12:21:23.047740 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:23 crc kubenswrapper[4990]: I1003 12:21:23.114679 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.009996 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6whq" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="registry-server" containerID="cri-o://778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986" gracePeriod=2 Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.572463 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.710216 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjtz\" (UniqueName: \"kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz\") pod \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.710302 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities\") pod \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.710564 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content\") pod \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\" (UID: \"ff360305-cd2f-4d46-bc49-422b2e0c42ed\") " Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.711281 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities" (OuterVolumeSpecName: "utilities") pod "ff360305-cd2f-4d46-bc49-422b2e0c42ed" (UID: "ff360305-cd2f-4d46-bc49-422b2e0c42ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.711479 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.715266 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz" (OuterVolumeSpecName: "kube-api-access-7xjtz") pod "ff360305-cd2f-4d46-bc49-422b2e0c42ed" (UID: "ff360305-cd2f-4d46-bc49-422b2e0c42ed"). InnerVolumeSpecName "kube-api-access-7xjtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.724183 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff360305-cd2f-4d46-bc49-422b2e0c42ed" (UID: "ff360305-cd2f-4d46-bc49-422b2e0c42ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.813768 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjtz\" (UniqueName: \"kubernetes.io/projected/ff360305-cd2f-4d46-bc49-422b2e0c42ed-kube-api-access-7xjtz\") on node \"crc\" DevicePath \"\"" Oct 03 12:21:25 crc kubenswrapper[4990]: I1003 12:21:25.813798 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff360305-cd2f-4d46-bc49-422b2e0c42ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.026194 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6whq" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.026276 4990 generic.go:334] "Generic (PLEG): container finished" podID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerID="778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986" exitCode=0 Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.026312 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerDied","Data":"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986"} Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.026604 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6whq" event={"ID":"ff360305-cd2f-4d46-bc49-422b2e0c42ed","Type":"ContainerDied","Data":"fa01389212e0709727e284d4fce3bdddf1f0353188641e892b3126864fc473da"} Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.026635 4990 scope.go:117] "RemoveContainer" containerID="778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.054807 4990 scope.go:117] "RemoveContainer" containerID="c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.074544 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.093468 4990 scope.go:117] "RemoveContainer" containerID="36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.096855 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6whq"] Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.145629 4990 scope.go:117] "RemoveContainer" 
containerID="778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986" Oct 03 12:21:26 crc kubenswrapper[4990]: E1003 12:21:26.146107 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986\": container with ID starting with 778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986 not found: ID does not exist" containerID="778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.146172 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986"} err="failed to get container status \"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986\": rpc error: code = NotFound desc = could not find container \"778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986\": container with ID starting with 778585c60180fc96e4444e3c8a5b90465a5bf886a7606f4216c6345c51e5f986 not found: ID does not exist" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.146205 4990 scope.go:117] "RemoveContainer" containerID="c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6" Oct 03 12:21:26 crc kubenswrapper[4990]: E1003 12:21:26.146493 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6\": container with ID starting with c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6 not found: ID does not exist" containerID="c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.146566 4990 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6"} err="failed to get container status \"c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6\": rpc error: code = NotFound desc = could not find container \"c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6\": container with ID starting with c79c623dbd0c1b74b5c0ea0b1f303f3d0afc63fa7e6c726e3a651ba807ff61e6 not found: ID does not exist" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.146586 4990 scope.go:117] "RemoveContainer" containerID="36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e" Oct 03 12:21:26 crc kubenswrapper[4990]: E1003 12:21:26.146861 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e\": container with ID starting with 36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e not found: ID does not exist" containerID="36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.146889 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e"} err="failed to get container status \"36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e\": rpc error: code = NotFound desc = could not find container \"36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e\": container with ID starting with 36d39ff8b446eca42a8c70069405b4e2073dc46c39e3aa521135b712b664212e not found: ID does not exist" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.872448 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:21:26 crc kubenswrapper[4990]: I1003 12:21:26.887439 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" path="/var/lib/kubelet/pods/ff360305-cd2f-4d46-bc49-422b2e0c42ed/volumes" Oct 03 12:21:28 crc kubenswrapper[4990]: I1003 12:21:28.060108 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979"} Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.847101 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:21:50 crc kubenswrapper[4990]: E1003 12:21:50.849104 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="extract-content" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.849127 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="extract-content" Oct 03 12:21:50 crc kubenswrapper[4990]: E1003 12:21:50.849150 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="extract-utilities" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.849158 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="extract-utilities" Oct 03 12:21:50 crc kubenswrapper[4990]: E1003 12:21:50.849184 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="registry-server" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.849191 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="registry-server" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.849495 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff360305-cd2f-4d46-bc49-422b2e0c42ed" containerName="registry-server" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.852335 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.892034 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.945047 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.945174 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wjh\" (UniqueName: \"kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:50 crc kubenswrapper[4990]: I1003 12:21:50.945366 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.047486 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wjh\" (UniqueName: \"kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh\") pod \"redhat-operators-b6fw9\" (UID: 
\"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.047668 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.047734 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.048585 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.048891 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.074587 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wjh\" (UniqueName: \"kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh\") pod \"redhat-operators-b6fw9\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " 
pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.172604 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:21:51 crc kubenswrapper[4990]: I1003 12:21:51.690074 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:21:52 crc kubenswrapper[4990]: I1003 12:21:52.373958 4990 generic.go:334] "Generic (PLEG): container finished" podID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerID="99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd" exitCode=0 Oct 03 12:21:52 crc kubenswrapper[4990]: I1003 12:21:52.374010 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerDied","Data":"99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd"} Oct 03 12:21:52 crc kubenswrapper[4990]: I1003 12:21:52.374040 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerStarted","Data":"b63fcfc7fb9dcebee9e43eaea13eb89a9d94d5a9e280bddcfde6b990a34293fb"} Oct 03 12:21:54 crc kubenswrapper[4990]: I1003 12:21:54.404554 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerStarted","Data":"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e"} Oct 03 12:22:01 crc kubenswrapper[4990]: E1003 12:22:01.321378 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda060f7f0_2be8_4b87_bbef_5e52fc6b708d.slice/crio-conmon-2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e.scope\": 
RecentStats: unable to find data in memory cache]" Oct 03 12:22:01 crc kubenswrapper[4990]: I1003 12:22:01.481402 4990 generic.go:334] "Generic (PLEG): container finished" podID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerID="2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e" exitCode=0 Oct 03 12:22:01 crc kubenswrapper[4990]: I1003 12:22:01.481725 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerDied","Data":"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e"} Oct 03 12:22:03 crc kubenswrapper[4990]: I1003 12:22:03.508682 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerStarted","Data":"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d"} Oct 03 12:22:03 crc kubenswrapper[4990]: I1003 12:22:03.570594 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b6fw9" podStartSLOduration=3.686322151 podStartE2EDuration="13.570568907s" podCreationTimestamp="2025-10-03 12:21:50 +0000 UTC" firstStartedPulling="2025-10-03 12:21:52.376067282 +0000 UTC m=+9494.172699139" lastFinishedPulling="2025-10-03 12:22:02.260314038 +0000 UTC m=+9504.056945895" observedRunningTime="2025-10-03 12:22:03.545627712 +0000 UTC m=+9505.342259589" watchObservedRunningTime="2025-10-03 12:22:03.570568907 +0000 UTC m=+9505.367200764" Oct 03 12:22:11 crc kubenswrapper[4990]: I1003 12:22:11.172811 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:11 crc kubenswrapper[4990]: I1003 12:22:11.173588 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:12 crc kubenswrapper[4990]: I1003 
12:22:12.941847 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b6fw9" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="registry-server" probeResult="failure" output=< Oct 03 12:22:12 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:22:12 crc kubenswrapper[4990]: > Oct 03 12:22:21 crc kubenswrapper[4990]: I1003 12:22:21.236438 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:21 crc kubenswrapper[4990]: I1003 12:22:21.330638 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:22 crc kubenswrapper[4990]: I1003 12:22:22.041224 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:22:22 crc kubenswrapper[4990]: I1003 12:22:22.730975 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b6fw9" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="registry-server" containerID="cri-o://b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d" gracePeriod=2 Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.240073 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.364929 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8wjh\" (UniqueName: \"kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh\") pod \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.365073 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content\") pod \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.365267 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities\") pod \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\" (UID: \"a060f7f0-2be8-4b87-bbef-5e52fc6b708d\") " Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.365955 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities" (OuterVolumeSpecName: "utilities") pod "a060f7f0-2be8-4b87-bbef-5e52fc6b708d" (UID: "a060f7f0-2be8-4b87-bbef-5e52fc6b708d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.373675 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh" (OuterVolumeSpecName: "kube-api-access-f8wjh") pod "a060f7f0-2be8-4b87-bbef-5e52fc6b708d" (UID: "a060f7f0-2be8-4b87-bbef-5e52fc6b708d"). InnerVolumeSpecName "kube-api-access-f8wjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.456848 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a060f7f0-2be8-4b87-bbef-5e52fc6b708d" (UID: "a060f7f0-2be8-4b87-bbef-5e52fc6b708d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.468073 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.468121 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.468141 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8wjh\" (UniqueName: \"kubernetes.io/projected/a060f7f0-2be8-4b87-bbef-5e52fc6b708d-kube-api-access-f8wjh\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.746496 4990 generic.go:334] "Generic (PLEG): container finished" podID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerID="b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d" exitCode=0 Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.746568 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b6fw9" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.746570 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerDied","Data":"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d"} Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.746659 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b6fw9" event={"ID":"a060f7f0-2be8-4b87-bbef-5e52fc6b708d","Type":"ContainerDied","Data":"b63fcfc7fb9dcebee9e43eaea13eb89a9d94d5a9e280bddcfde6b990a34293fb"} Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.746683 4990 scope.go:117] "RemoveContainer" containerID="b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.782570 4990 scope.go:117] "RemoveContainer" containerID="2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.802235 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.818886 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b6fw9"] Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.824094 4990 scope.go:117] "RemoveContainer" containerID="99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.876821 4990 scope.go:117] "RemoveContainer" containerID="b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d" Oct 03 12:22:23 crc kubenswrapper[4990]: E1003 12:22:23.877681 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d\": container with ID starting with b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d not found: ID does not exist" containerID="b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.877722 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d"} err="failed to get container status \"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d\": rpc error: code = NotFound desc = could not find container \"b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d\": container with ID starting with b9455e965751273c5b60c3be8f523e5d2b2ff4d2a3440a2957dc20201f3b682d not found: ID does not exist" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.877756 4990 scope.go:117] "RemoveContainer" containerID="2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e" Oct 03 12:22:23 crc kubenswrapper[4990]: E1003 12:22:23.878209 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e\": container with ID starting with 2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e not found: ID does not exist" containerID="2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.878270 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e"} err="failed to get container status \"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e\": rpc error: code = NotFound desc = could not find container \"2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e\": container with ID 
starting with 2c92e188974824f13c5173eda0f3e02c9faf1d4217d85825528fb376b61a357e not found: ID does not exist" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.878317 4990 scope.go:117] "RemoveContainer" containerID="99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd" Oct 03 12:22:23 crc kubenswrapper[4990]: E1003 12:22:23.878833 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd\": container with ID starting with 99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd not found: ID does not exist" containerID="99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd" Oct 03 12:22:23 crc kubenswrapper[4990]: I1003 12:22:23.878868 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd"} err="failed to get container status \"99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd\": rpc error: code = NotFound desc = could not find container \"99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd\": container with ID starting with 99a4b44bc890b4ee3436afeeacacc17cb99be8a6a580f98e1d3e1dc1508092fd not found: ID does not exist" Oct 03 12:22:24 crc kubenswrapper[4990]: I1003 12:22:24.895715 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" path="/var/lib/kubelet/pods/a060f7f0-2be8-4b87-bbef-5e52fc6b708d/volumes" Oct 03 12:22:49 crc kubenswrapper[4990]: I1003 12:22:49.033116 4990 generic.go:334] "Generic (PLEG): container finished" podID="11994fd9-0d27-4cb9-9871-40e9416961fd" containerID="017c147b63986930061cc502d1c6c1058d129f65ce2f567b32f3776e9939fde5" exitCode=0 Oct 03 12:22:49 crc kubenswrapper[4990]: I1003 12:22:49.033350 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" event={"ID":"11994fd9-0d27-4cb9-9871-40e9416961fd","Type":"ContainerDied","Data":"017c147b63986930061cc502d1c6c1058d129f65ce2f567b32f3776e9939fde5"} Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.529162 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.636307 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory\") pod \"11994fd9-0d27-4cb9-9871-40e9416961fd\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.636365 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key\") pod \"11994fd9-0d27-4cb9-9871-40e9416961fd\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.636458 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btmtk\" (UniqueName: \"kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk\") pod \"11994fd9-0d27-4cb9-9871-40e9416961fd\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.636501 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle\") pod \"11994fd9-0d27-4cb9-9871-40e9416961fd\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.636562 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0\") pod \"11994fd9-0d27-4cb9-9871-40e9416961fd\" (UID: \"11994fd9-0d27-4cb9-9871-40e9416961fd\") " Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.642033 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "11994fd9-0d27-4cb9-9871-40e9416961fd" (UID: "11994fd9-0d27-4cb9-9871-40e9416961fd"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.642669 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk" (OuterVolumeSpecName: "kube-api-access-btmtk") pod "11994fd9-0d27-4cb9-9871-40e9416961fd" (UID: "11994fd9-0d27-4cb9-9871-40e9416961fd"). InnerVolumeSpecName "kube-api-access-btmtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.667732 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "11994fd9-0d27-4cb9-9871-40e9416961fd" (UID: "11994fd9-0d27-4cb9-9871-40e9416961fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.667905 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "11994fd9-0d27-4cb9-9871-40e9416961fd" (UID: "11994fd9-0d27-4cb9-9871-40e9416961fd"). 
InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.670646 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory" (OuterVolumeSpecName: "inventory") pod "11994fd9-0d27-4cb9-9871-40e9416961fd" (UID: "11994fd9-0d27-4cb9-9871-40e9416961fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.739761 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.739803 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.739817 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btmtk\" (UniqueName: \"kubernetes.io/projected/11994fd9-0d27-4cb9-9871-40e9416961fd-kube-api-access-btmtk\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.739832 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:50 crc kubenswrapper[4990]: I1003 12:22:50.739845 4990 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11994fd9-0d27-4cb9-9871-40e9416961fd-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:22:51 crc kubenswrapper[4990]: I1003 12:22:51.064451 4990 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" event={"ID":"11994fd9-0d27-4cb9-9871-40e9416961fd","Type":"ContainerDied","Data":"31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e"} Oct 03 12:22:51 crc kubenswrapper[4990]: I1003 12:22:51.064778 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31cb3a47610a4e1aaae9f8c151f67f4ab0dd76da40dd377417fa93372780534e" Oct 03 12:22:51 crc kubenswrapper[4990]: I1003 12:22:51.064505 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-d77pm" Oct 03 12:23:15 crc kubenswrapper[4990]: I1003 12:23:15.903994 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:15 crc kubenswrapper[4990]: I1003 12:23:15.904871 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a33bb044-a557-41e0-93be-9160096406a3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" gracePeriod=30 Oct 03 12:23:16 crc kubenswrapper[4990]: I1003 12:23:16.957535 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:16 crc kubenswrapper[4990]: I1003 12:23:16.958368 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="047f6c42-6806-4dde-b829-7748a2560e2a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://80ff68c4a3e79b5d970f66d8b340c581812ccd94c0e23b62621d70e3ce5516d4" gracePeriod=30 Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.181691 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.181936 4990 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" containerName="nova-scheduler-scheduler" containerID="cri-o://e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" gracePeriod=30 Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.190491 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.198572 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.198781 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" containerID="cri-o://a3f13f83759e227f9908ec6a6abbb9d6dc52019f21189f46ced9e01e967d412f" gracePeriod=30 Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.199196 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" containerID="cri-o://7a08f708df8c8f2b7ed6869a02da58608e553958961a8fa9b16e21ee7ec83184" gracePeriod=30 Oct 03 12:23:17 crc kubenswrapper[4990]: E1003 12:23:17.338346 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 12:23:17 crc kubenswrapper[4990]: E1003 12:23:17.343735 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 12:23:17 crc kubenswrapper[4990]: E1003 12:23:17.345751 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 12:23:17 crc kubenswrapper[4990]: E1003 12:23:17.345954 4990 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a33bb044-a557-41e0-93be-9160096406a3" containerName="nova-cell0-conductor-conductor" Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.389911 4990 generic.go:334] "Generic (PLEG): container finished" podID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerID="a3f13f83759e227f9908ec6a6abbb9d6dc52019f21189f46ced9e01e967d412f" exitCode=143 Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.390083 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-log" containerID="cri-o://39d718c35f7c6da257a68938ac8e2401fabb6923863e7a3d489d25de81723bab" gracePeriod=30 Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.390297 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerDied","Data":"a3f13f83759e227f9908ec6a6abbb9d6dc52019f21189f46ced9e01e967d412f"} Oct 03 12:23:17 crc kubenswrapper[4990]: I1003 12:23:17.390597 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-api" 
containerID="cri-o://438fae3c516281efd9b6614bc2d17cbaf00a08b935c2a1bfa0583fc3ac27cec2" gracePeriod=30 Oct 03 12:23:18 crc kubenswrapper[4990]: I1003 12:23:18.401289 4990 generic.go:334] "Generic (PLEG): container finished" podID="c9929494-48ad-4171-89f4-654931af17a8" containerID="39d718c35f7c6da257a68938ac8e2401fabb6923863e7a3d489d25de81723bab" exitCode=143 Oct 03 12:23:18 crc kubenswrapper[4990]: I1003 12:23:18.401414 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerDied","Data":"39d718c35f7c6da257a68938ac8e2401fabb6923863e7a3d489d25de81723bab"} Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.064574 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 is running failed: container process not found" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.065167 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 is running failed: container process not found" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.065578 4990 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 is running failed: container process not found" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.065612 4990 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" containerName="nova-scheduler-scheduler" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.204757 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.295284 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data\") pod \"f631f859-cbf9-48c9-9555-147dcce70e07\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.295421 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle\") pod \"f631f859-cbf9-48c9-9555-147dcce70e07\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.295616 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65c46\" (UniqueName: \"kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46\") pod \"f631f859-cbf9-48c9-9555-147dcce70e07\" (UID: \"f631f859-cbf9-48c9-9555-147dcce70e07\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.317034 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46" (OuterVolumeSpecName: 
"kube-api-access-65c46") pod "f631f859-cbf9-48c9-9555-147dcce70e07" (UID: "f631f859-cbf9-48c9-9555-147dcce70e07"). InnerVolumeSpecName "kube-api-access-65c46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.365498 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f631f859-cbf9-48c9-9555-147dcce70e07" (UID: "f631f859-cbf9-48c9-9555-147dcce70e07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.384790 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data" (OuterVolumeSpecName: "config-data") pod "f631f859-cbf9-48c9-9555-147dcce70e07" (UID: "f631f859-cbf9-48c9-9555-147dcce70e07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.399658 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.399702 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f631f859-cbf9-48c9-9555-147dcce70e07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.399716 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65c46\" (UniqueName: \"kubernetes.io/projected/f631f859-cbf9-48c9-9555-147dcce70e07-kube-api-access-65c46\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.445341 4990 generic.go:334] "Generic (PLEG): container finished" podID="047f6c42-6806-4dde-b829-7748a2560e2a" containerID="80ff68c4a3e79b5d970f66d8b340c581812ccd94c0e23b62621d70e3ce5516d4" exitCode=0 Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.445433 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"047f6c42-6806-4dde-b829-7748a2560e2a","Type":"ContainerDied","Data":"80ff68c4a3e79b5d970f66d8b340c581812ccd94c0e23b62621d70e3ce5516d4"} Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.451591 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"047f6c42-6806-4dde-b829-7748a2560e2a","Type":"ContainerDied","Data":"b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb"} Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.451633 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6841bb5358cb843fd785b894a0f84330190e18075c0a338fbba068186d5affb" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 
12:23:19.458063 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.484405 4990 generic.go:334] "Generic (PLEG): container finished" podID="f631f859-cbf9-48c9-9555-147dcce70e07" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" exitCode=0 Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.484460 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f631f859-cbf9-48c9-9555-147dcce70e07","Type":"ContainerDied","Data":"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13"} Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.484491 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f631f859-cbf9-48c9-9555-147dcce70e07","Type":"ContainerDied","Data":"11fccaecfb7ebd40bc1c94f8e312909e9c1051d0855e4ccfa625a5024c8280e5"} Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.484513 4990 scope.go:117] "RemoveContainer" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.484688 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.569705 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.589881 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.607604 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608105 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047f6c42-6806-4dde-b829-7748a2560e2a" containerName="nova-cell1-conductor-conductor" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608119 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="047f6c42-6806-4dde-b829-7748a2560e2a" containerName="nova-cell1-conductor-conductor" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608140 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="extract-content" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608146 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="extract-content" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608161 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="registry-server" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608168 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="registry-server" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608180 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11994fd9-0d27-4cb9-9871-40e9416961fd" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 03 12:23:19 crc 
kubenswrapper[4990]: I1003 12:23:19.608188 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="11994fd9-0d27-4cb9-9871-40e9416961fd" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608198 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="extract-utilities" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608204 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="extract-utilities" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.608223 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" containerName="nova-scheduler-scheduler" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608229 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" containerName="nova-scheduler-scheduler" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608471 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a060f7f0-2be8-4b87-bbef-5e52fc6b708d" containerName="registry-server" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608497 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="11994fd9-0d27-4cb9-9871-40e9416961fd" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608507 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="047f6c42-6806-4dde-b829-7748a2560e2a" containerName="nova-cell1-conductor-conductor" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.608518 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" containerName="nova-scheduler-scheduler" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.609261 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.615736 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.626725 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.628485 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thhnf\" (UniqueName: \"kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf\") pod \"047f6c42-6806-4dde-b829-7748a2560e2a\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.628666 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data\") pod \"047f6c42-6806-4dde-b829-7748a2560e2a\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.628833 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle\") pod \"047f6c42-6806-4dde-b829-7748a2560e2a\" (UID: \"047f6c42-6806-4dde-b829-7748a2560e2a\") " Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.642775 4990 scope.go:117] "RemoveContainer" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" Oct 03 12:23:19 crc kubenswrapper[4990]: E1003 12:23:19.643339 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13\": container with ID starting with e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 not found: 
ID does not exist" containerID="e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.643382 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13"} err="failed to get container status \"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13\": rpc error: code = NotFound desc = could not find container \"e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13\": container with ID starting with e1894dc163dcf288fa8f6d83c7dafeb31723a5198cac5b6ec2519f726c4fea13 not found: ID does not exist" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.664447 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf" (OuterVolumeSpecName: "kube-api-access-thhnf") pod "047f6c42-6806-4dde-b829-7748a2560e2a" (UID: "047f6c42-6806-4dde-b829-7748a2560e2a"). InnerVolumeSpecName "kube-api-access-thhnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.718184 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "047f6c42-6806-4dde-b829-7748a2560e2a" (UID: "047f6c42-6806-4dde-b829-7748a2560e2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.728766 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data" (OuterVolumeSpecName: "config-data") pod "047f6c42-6806-4dde-b829-7748a2560e2a" (UID: "047f6c42-6806-4dde-b829-7748a2560e2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733495 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733595 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4rr\" (UniqueName: \"kubernetes.io/projected/989baf10-df5f-4e5d-a3af-0475029519e5-kube-api-access-tq4rr\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733726 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-config-data\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733803 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733824 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thhnf\" (UniqueName: \"kubernetes.io/projected/047f6c42-6806-4dde-b829-7748a2560e2a-kube-api-access-thhnf\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.733839 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f6c42-6806-4dde-b829-7748a2560e2a-config-data\") on 
node \"crc\" DevicePath \"\"" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.835921 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.835982 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4rr\" (UniqueName: \"kubernetes.io/projected/989baf10-df5f-4e5d-a3af-0475029519e5-kube-api-access-tq4rr\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.836076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-config-data\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.840473 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-config-data\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.840806 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989baf10-df5f-4e5d-a3af-0475029519e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:19 crc kubenswrapper[4990]: I1003 12:23:19.852636 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tq4rr\" (UniqueName: \"kubernetes.io/projected/989baf10-df5f-4e5d-a3af-0475029519e5-kube-api-access-tq4rr\") pod \"nova-scheduler-0\" (UID: \"989baf10-df5f-4e5d-a3af-0475029519e5\") " pod="openstack/nova-scheduler-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.137929 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.452364 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": dial tcp 10.217.1.95:8775: connect: connection refused" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.453730 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.95:8775/\": dial tcp 10.217.1.95:8775: connect: connection refused" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.497351 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.552872 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.569426 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.586492 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.587964 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.590873 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.608604 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.739885 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.769743 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.769817 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.770144 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfjb\" (UniqueName: \"kubernetes.io/projected/a982af33-ad1d-409f-86f4-eb132a83dfa8-kube-api-access-gjfjb\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.871871 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.871942 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.872031 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfjb\" (UniqueName: \"kubernetes.io/projected/a982af33-ad1d-409f-86f4-eb132a83dfa8-kube-api-access-gjfjb\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.875875 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.876603 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a982af33-ad1d-409f-86f4-eb132a83dfa8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.888611 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047f6c42-6806-4dde-b829-7748a2560e2a" path="/var/lib/kubelet/pods/047f6c42-6806-4dde-b829-7748a2560e2a/volumes" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 
12:23:20.889318 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f631f859-cbf9-48c9-9555-147dcce70e07" path="/var/lib/kubelet/pods/f631f859-cbf9-48c9-9555-147dcce70e07/volumes" Oct 03 12:23:20 crc kubenswrapper[4990]: I1003 12:23:20.895475 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfjb\" (UniqueName: \"kubernetes.io/projected/a982af33-ad1d-409f-86f4-eb132a83dfa8-kube-api-access-gjfjb\") pod \"nova-cell1-conductor-0\" (UID: \"a982af33-ad1d-409f-86f4-eb132a83dfa8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.013285 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.517507 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.520497 4990 generic.go:334] "Generic (PLEG): container finished" podID="c9929494-48ad-4171-89f4-654931af17a8" containerID="438fae3c516281efd9b6614bc2d17cbaf00a08b935c2a1bfa0583fc3ac27cec2" exitCode=0 Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.520620 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerDied","Data":"438fae3c516281efd9b6614bc2d17cbaf00a08b935c2a1bfa0583fc3ac27cec2"} Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.530280 4990 generic.go:334] "Generic (PLEG): container finished" podID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerID="7a08f708df8c8f2b7ed6869a02da58608e553958961a8fa9b16e21ee7ec83184" exitCode=0 Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.530336 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerDied","Data":"7a08f708df8c8f2b7ed6869a02da58608e553958961a8fa9b16e21ee7ec83184"} Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.531650 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"989baf10-df5f-4e5d-a3af-0475029519e5","Type":"ContainerStarted","Data":"cbdeb95d8c9a85fe1b36128732919fbd95449e6a63b290a2929598a7e410880f"} Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.531673 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"989baf10-df5f-4e5d-a3af-0475029519e5","Type":"ContainerStarted","Data":"c17ee9ddef8c522a3c3b672762445dccf7bd52b6fd73cd7219eaa48e3671eeaf"} Oct 03 12:23:21 crc kubenswrapper[4990]: W1003 12:23:21.545231 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda982af33_ad1d_409f_86f4_eb132a83dfa8.slice/crio-4bdfb06684a30743633736ea913a6c9d2528c7c019ecf6449d9d0b9ca8cd6196 WatchSource:0}: Error finding container 4bdfb06684a30743633736ea913a6c9d2528c7c019ecf6449d9d0b9ca8cd6196: Status 404 returned error can't find the container with id 4bdfb06684a30743633736ea913a6c9d2528c7c019ecf6449d9d0b9ca8cd6196 Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.554669 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.588017 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.58799958 podStartE2EDuration="2.58799958s" podCreationTimestamp="2025-10-03 12:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:23:21.551985498 +0000 UTC m=+9583.348617355" watchObservedRunningTime="2025-10-03 12:23:21.58799958 +0000 UTC m=+9583.384631437" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.691663 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs\") pod \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.691986 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle\") pod \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.692200 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data\") pod \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.692221 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvllx\" (UniqueName: \"kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx\") pod \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\" (UID: 
\"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.692283 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs\") pod \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\" (UID: \"9f406cae-9e16-4ebf-8a12-4585b177eb9d\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.694083 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs" (OuterVolumeSpecName: "logs") pod "9f406cae-9e16-4ebf-8a12-4585b177eb9d" (UID: "9f406cae-9e16-4ebf-8a12-4585b177eb9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.717897 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx" (OuterVolumeSpecName: "kube-api-access-lvllx") pod "9f406cae-9e16-4ebf-8a12-4585b177eb9d" (UID: "9f406cae-9e16-4ebf-8a12-4585b177eb9d"). InnerVolumeSpecName "kube-api-access-lvllx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.754741 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data" (OuterVolumeSpecName: "config-data") pod "9f406cae-9e16-4ebf-8a12-4585b177eb9d" (UID: "9f406cae-9e16-4ebf-8a12-4585b177eb9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.763911 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f406cae-9e16-4ebf-8a12-4585b177eb9d" (UID: "9f406cae-9e16-4ebf-8a12-4585b177eb9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.779772 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9f406cae-9e16-4ebf-8a12-4585b177eb9d" (UID: "9f406cae-9e16-4ebf-8a12-4585b177eb9d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.795351 4990 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.795391 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f406cae-9e16-4ebf-8a12-4585b177eb9d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.795405 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.795417 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f406cae-9e16-4ebf-8a12-4585b177eb9d-config-data\") on 
node \"crc\" DevicePath \"\"" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.795429 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvllx\" (UniqueName: \"kubernetes.io/projected/9f406cae-9e16-4ebf-8a12-4585b177eb9d-kube-api-access-lvllx\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.886373 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.999008 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7666\" (UniqueName: \"kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.999101 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.999601 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs" (OuterVolumeSpecName: "logs") pod "c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.999670 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:21 crc kubenswrapper[4990]: I1003 12:23:21.999689 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:21.999994 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.000149 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data\") pod \"c9929494-48ad-4171-89f4-654931af17a8\" (UID: \"c9929494-48ad-4171-89f4-654931af17a8\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.001116 4990 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9929494-48ad-4171-89f4-654931af17a8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.002801 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666" (OuterVolumeSpecName: "kube-api-access-n7666") pod 
"c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "kube-api-access-n7666". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.049854 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data" (OuterVolumeSpecName: "config-data") pod "c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.052663 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.076557 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.092006 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9929494-48ad-4171-89f4-654931af17a8" (UID: "c9929494-48ad-4171-89f4-654931af17a8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.103211 4990 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.103246 4990 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.103255 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.103268 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9929494-48ad-4171-89f4-654931af17a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.103278 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7666\" (UniqueName: \"kubernetes.io/projected/c9929494-48ad-4171-89f4-654931af17a8-kube-api-access-n7666\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.335605 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.512402 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle\") pod \"a33bb044-a557-41e0-93be-9160096406a3\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.512648 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vr7f\" (UniqueName: \"kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f\") pod \"a33bb044-a557-41e0-93be-9160096406a3\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.512674 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data\") pod \"a33bb044-a557-41e0-93be-9160096406a3\" (UID: \"a33bb044-a557-41e0-93be-9160096406a3\") " Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.518741 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f" (OuterVolumeSpecName: "kube-api-access-6vr7f") pod "a33bb044-a557-41e0-93be-9160096406a3" (UID: "a33bb044-a557-41e0-93be-9160096406a3"). InnerVolumeSpecName "kube-api-access-6vr7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.540133 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a33bb044-a557-41e0-93be-9160096406a3" (UID: "a33bb044-a557-41e0-93be-9160096406a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.541352 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a982af33-ad1d-409f-86f4-eb132a83dfa8","Type":"ContainerStarted","Data":"2b3f0936f394c799cf251aaadc369f0dc47b8a88cbe3323a7025b134aef10b97"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.541393 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a982af33-ad1d-409f-86f4-eb132a83dfa8","Type":"ContainerStarted","Data":"4bdfb06684a30743633736ea913a6c9d2528c7c019ecf6449d9d0b9ca8cd6196"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.541426 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.543952 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9929494-48ad-4171-89f4-654931af17a8","Type":"ContainerDied","Data":"a156fdb9ab7ac6452eeea03f185dc12b0b980607e998108f470263da0ccd806f"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.543999 4990 scope.go:117] "RemoveContainer" containerID="438fae3c516281efd9b6614bc2d17cbaf00a08b935c2a1bfa0583fc3ac27cec2" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.543961 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.545780 4990 generic.go:334] "Generic (PLEG): container finished" podID="a33bb044-a557-41e0-93be-9160096406a3" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" exitCode=0 Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.545820 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a33bb044-a557-41e0-93be-9160096406a3","Type":"ContainerDied","Data":"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.545837 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a33bb044-a557-41e0-93be-9160096406a3","Type":"ContainerDied","Data":"805ca2768f75f6565093d75f167f6bae3cfb6528579fab31b3d4d2e2b2f4691d"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.545865 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.548305 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.548328 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f406cae-9e16-4ebf-8a12-4585b177eb9d","Type":"ContainerDied","Data":"49bde3b6aae47fcaca00ee2b2cf9cae9f828ed5ca5256cbb285864e22f801077"} Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.554850 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data" (OuterVolumeSpecName: "config-data") pod "a33bb044-a557-41e0-93be-9160096406a3" (UID: "a33bb044-a557-41e0-93be-9160096406a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.561569 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.561552988 podStartE2EDuration="2.561552988s" podCreationTimestamp="2025-10-03 12:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:23:22.557612286 +0000 UTC m=+9584.354244143" watchObservedRunningTime="2025-10-03 12:23:22.561552988 +0000 UTC m=+9584.358184865" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.569907 4990 scope.go:117] "RemoveContainer" containerID="39d718c35f7c6da257a68938ac8e2401fabb6923863e7a3d489d25de81723bab" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.587345 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.600646 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.610373 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.618423 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vr7f\" (UniqueName: \"kubernetes.io/projected/a33bb044-a557-41e0-93be-9160096406a3-kube-api-access-6vr7f\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.618453 4990 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.618464 4990 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a33bb044-a557-41e0-93be-9160096406a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.619713 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.636576 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.637066 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637085 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.637122 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-log" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637130 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-log" Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.637147 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637154 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.637163 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-api" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637170 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9929494-48ad-4171-89f4-654931af17a8" 
containerName="nova-api-api" Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.637197 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33bb044-a557-41e0-93be-9160096406a3" containerName="nova-cell0-conductor-conductor" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637203 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33bb044-a557-41e0-93be-9160096406a3" containerName="nova-cell0-conductor-conductor" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637422 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-log" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637460 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" containerName="nova-metadata-metadata" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637477 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-api" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637494 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33bb044-a557-41e0-93be-9160096406a3" containerName="nova-cell0-conductor-conductor" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.637508 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9929494-48ad-4171-89f4-654931af17a8" containerName="nova-api-log" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.638977 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.641591 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.641873 4990 scope.go:117] "RemoveContainer" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.645710 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.646075 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.650074 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.651711 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.660390 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.660546 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.660677 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.678492 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.703185 4990 scope.go:117] "RemoveContainer" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" Oct 03 12:23:22 crc kubenswrapper[4990]: E1003 12:23:22.703609 4990 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8\": container with ID starting with c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8 not found: ID does not exist" containerID="c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.703640 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8"} err="failed to get container status \"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8\": rpc error: code = NotFound desc = could not find container \"c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8\": container with ID starting with c807529d9263115e8d062f42655ee45780e54ec10896c20d1b73b51385466dd8 not found: ID does not exist" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.703661 4990 scope.go:117] "RemoveContainer" containerID="7a08f708df8c8f2b7ed6869a02da58608e553958961a8fa9b16e21ee7ec83184" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.747192 4990 scope.go:117] "RemoveContainer" containerID="a3f13f83759e227f9908ec6a6abbb9d6dc52019f21189f46ced9e01e967d412f" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.824599 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d6174c4-8040-4f21-873f-a508d9e83ce3-logs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.824659 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-config-data\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " 
pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.824743 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825172 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-logs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825222 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825285 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825337 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825372 4990 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfk6\" (UniqueName: \"kubernetes.io/projected/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-kube-api-access-mlfk6\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825455 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7z4\" (UniqueName: \"kubernetes.io/projected/1d6174c4-8040-4f21-873f-a508d9e83ce3-kube-api-access-4q7z4\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825497 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-config-data\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.825547 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.885668 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f406cae-9e16-4ebf-8a12-4585b177eb9d" path="/var/lib/kubelet/pods/9f406cae-9e16-4ebf-8a12-4585b177eb9d/volumes" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.886577 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9929494-48ad-4171-89f4-654931af17a8" path="/var/lib/kubelet/pods/c9929494-48ad-4171-89f4-654931af17a8/volumes" Oct 03 12:23:22 crc kubenswrapper[4990]: 
I1003 12:23:22.887390 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.891498 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.913929 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.916326 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.918592 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.926192 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927611 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927678 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927704 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfk6\" (UniqueName: \"kubernetes.io/projected/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-kube-api-access-mlfk6\") pod \"nova-metadata-0\" (UID: 
\"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927772 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q7z4\" (UniqueName: \"kubernetes.io/projected/1d6174c4-8040-4f21-873f-a508d9e83ce3-kube-api-access-4q7z4\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927803 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-config-data\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927825 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927859 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d6174c4-8040-4f21-873f-a508d9e83ce3-logs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927878 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-config-data\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927906 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927953 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-logs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.927972 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.933620 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.934008 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-logs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.934314 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d6174c4-8040-4f21-873f-a508d9e83ce3-logs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.940120 4990 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-public-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.949065 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-config-data\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.949751 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-config-data\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.950732 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.955382 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.960173 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d6174c4-8040-4f21-873f-a508d9e83ce3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " 
pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.963832 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7z4\" (UniqueName: \"kubernetes.io/projected/1d6174c4-8040-4f21-873f-a508d9e83ce3-kube-api-access-4q7z4\") pod \"nova-api-0\" (UID: \"1d6174c4-8040-4f21-873f-a508d9e83ce3\") " pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.965215 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfk6\" (UniqueName: \"kubernetes.io/projected/33fda7a9-0a65-4e23-b1c8-0f74f21b9515-kube-api-access-mlfk6\") pod \"nova-metadata-0\" (UID: \"33fda7a9-0a65-4e23-b1c8-0f74f21b9515\") " pod="openstack/nova-metadata-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.967359 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 12:23:22 crc kubenswrapper[4990]: I1003 12:23:22.996764 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.030038 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.030725 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j977z\" (UniqueName: \"kubernetes.io/projected/fdcc49a5-24b8-4815-80b8-5d3edf557360-kube-api-access-j977z\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.030789 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.133516 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.133791 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j977z\" (UniqueName: \"kubernetes.io/projected/fdcc49a5-24b8-4815-80b8-5d3edf557360-kube-api-access-j977z\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " 
pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.133864 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.148183 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.148487 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc49a5-24b8-4815-80b8-5d3edf557360-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.160660 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j977z\" (UniqueName: \"kubernetes.io/projected/fdcc49a5-24b8-4815-80b8-5d3edf557360-kube-api-access-j977z\") pod \"nova-cell0-conductor-0\" (UID: \"fdcc49a5-24b8-4815-80b8-5d3edf557360\") " pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.385076 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.527040 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.537521 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 12:23:23 crc kubenswrapper[4990]: W1003 12:23:23.546193 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d6174c4_8040_4f21_873f_a508d9e83ce3.slice/crio-460ebbbf4cc9a6b10428c44c075aaaf0d3303e4f9be3da97c2aece46e552f260 WatchSource:0}: Error finding container 460ebbbf4cc9a6b10428c44c075aaaf0d3303e4f9be3da97c2aece46e552f260: Status 404 returned error can't find the container with id 460ebbbf4cc9a6b10428c44c075aaaf0d3303e4f9be3da97c2aece46e552f260 Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.596170 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33fda7a9-0a65-4e23-b1c8-0f74f21b9515","Type":"ContainerStarted","Data":"e0758a54d701809751f354cba5ef753c1f454a70a1369c71e4e62afe9a526bb3"} Oct 03 12:23:23 crc kubenswrapper[4990]: I1003 12:23:23.904286 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.623495 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d6174c4-8040-4f21-873f-a508d9e83ce3","Type":"ContainerStarted","Data":"72c9734ca67700e9391348794b53f0681583257777871fedf16c78d362c679e6"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.623808 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d6174c4-8040-4f21-873f-a508d9e83ce3","Type":"ContainerStarted","Data":"36b9a78338a79e90d1171f8acefa81d0be1a3fdcf97b148368fb8f54a0d5fc57"} Oct 03 12:23:24 crc kubenswrapper[4990]: 
I1003 12:23:24.623823 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1d6174c4-8040-4f21-873f-a508d9e83ce3","Type":"ContainerStarted","Data":"460ebbbf4cc9a6b10428c44c075aaaf0d3303e4f9be3da97c2aece46e552f260"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.625279 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdcc49a5-24b8-4815-80b8-5d3edf557360","Type":"ContainerStarted","Data":"bd72244d629ee29bed8321c019aec47b4c7c3447eb8be06d3d21bfeeaa4cd9dc"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.625315 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fdcc49a5-24b8-4815-80b8-5d3edf557360","Type":"ContainerStarted","Data":"af6747585a1bde7c69949ced4190536707988dca586b2305a1ce2c9915c85e40"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.625420 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.627431 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33fda7a9-0a65-4e23-b1c8-0f74f21b9515","Type":"ContainerStarted","Data":"a8f1f5b65f09766d6523a8f62f45a096149d0d8bc3a799b927ad1bae949d6bda"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.627471 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"33fda7a9-0a65-4e23-b1c8-0f74f21b9515","Type":"ContainerStarted","Data":"ae82b779961cb71b327f2a339bb153ec11301af312705c48697349dc328b6eb5"} Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.652150 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.652132665 podStartE2EDuration="2.652132665s" podCreationTimestamp="2025-10-03 12:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:23:24.644488477 +0000 UTC m=+9586.441120334" watchObservedRunningTime="2025-10-03 12:23:24.652132665 +0000 UTC m=+9586.448764522" Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.666744 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.666724273 podStartE2EDuration="2.666724273s" podCreationTimestamp="2025-10-03 12:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:23:24.662650907 +0000 UTC m=+9586.459282784" watchObservedRunningTime="2025-10-03 12:23:24.666724273 +0000 UTC m=+9586.463356130" Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.696091 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6960557019999998 podStartE2EDuration="2.696055702s" podCreationTimestamp="2025-10-03 12:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:23:24.685107658 +0000 UTC m=+9586.481739535" watchObservedRunningTime="2025-10-03 12:23:24.696055702 +0000 UTC m=+9586.492687559" Oct 03 12:23:24 crc kubenswrapper[4990]: I1003 12:23:24.889648 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33bb044-a557-41e0-93be-9160096406a3" path="/var/lib/kubelet/pods/a33bb044-a557-41e0-93be-9160096406a3/volumes" Oct 03 12:23:25 crc kubenswrapper[4990]: I1003 12:23:25.138783 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 12:23:26 crc kubenswrapper[4990]: I1003 12:23:26.415832 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 12:23:27 crc kubenswrapper[4990]: I1003 12:23:27.997194 
4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 12:23:27 crc kubenswrapper[4990]: I1003 12:23:27.997753 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 12:23:30 crc kubenswrapper[4990]: I1003 12:23:30.138867 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 12:23:30 crc kubenswrapper[4990]: I1003 12:23:30.178104 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 12:23:30 crc kubenswrapper[4990]: I1003 12:23:30.734934 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 12:23:32 crc kubenswrapper[4990]: I1003 12:23:32.967705 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 12:23:32 crc kubenswrapper[4990]: I1003 12:23:32.968580 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 12:23:32 crc kubenswrapper[4990]: I1003 12:23:32.997584 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 12:23:32 crc kubenswrapper[4990]: I1003 12:23:32.997646 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 12:23:33 crc kubenswrapper[4990]: I1003 12:23:33.417575 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 12:23:33 crc kubenswrapper[4990]: I1003 12:23:33.979745 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d6174c4-8040-4f21-873f-a508d9e83ce3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 
12:23:33 crc kubenswrapper[4990]: I1003 12:23:33.979779 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1d6174c4-8040-4f21-873f-a508d9e83ce3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 12:23:34 crc kubenswrapper[4990]: I1003 12:23:34.012721 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33fda7a9-0a65-4e23-b1c8-0f74f21b9515" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 12:23:34 crc kubenswrapper[4990]: I1003 12:23:34.012767 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="33fda7a9-0a65-4e23-b1c8-0f74f21b9515" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.208:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 12:23:42 crc kubenswrapper[4990]: I1003 12:23:42.976759 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 12:23:42 crc kubenswrapper[4990]: I1003 12:23:42.978086 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 12:23:42 crc kubenswrapper[4990]: I1003 12:23:42.983634 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 12:23:42 crc kubenswrapper[4990]: I1003 12:23:42.992021 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.012261 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.014285 4990 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.020052 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.847284 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.856709 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 12:23:43 crc kubenswrapper[4990]: I1003 12:23:43.857167 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.233683 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l"] Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.235798 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.238722 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.239210 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-54bdl" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.239380 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.239672 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.239869 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.240014 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.240161 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.250333 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l"] Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337098 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337149 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337384 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337432 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337547 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd7qg\" (UniqueName: \"kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337619 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337695 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337763 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.337837 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440237 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440313 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440333 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440410 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc 
kubenswrapper[4990]: I1003 12:23:45.440431 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440472 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd7qg\" (UniqueName: \"kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440507 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440569 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.440651 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.442025 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.446324 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.447066 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.447308 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.447923 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.448642 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.450320 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.451175 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: 
\"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.461761 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd7qg\" (UniqueName: \"kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:45 crc kubenswrapper[4990]: I1003 12:23:45.557209 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:23:46 crc kubenswrapper[4990]: I1003 12:23:46.190179 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l"] Oct 03 12:23:46 crc kubenswrapper[4990]: I1003 12:23:46.197048 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:23:46 crc kubenswrapper[4990]: I1003 12:23:46.890854 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" event={"ID":"f78234e5-0ad8-42d0-9854-c5b70d570beb","Type":"ContainerStarted","Data":"6d00cbc0fc79fdfa0dab3becca512e52b9a17118d5036ff65049654a4225ab58"} Oct 03 12:23:47 crc kubenswrapper[4990]: I1003 12:23:47.897087 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" event={"ID":"f78234e5-0ad8-42d0-9854-c5b70d570beb","Type":"ContainerStarted","Data":"d78b6c32694055fe026cc9e36a75c18ae40975e6103860b3244fe6ca192bb7b8"} Oct 03 12:23:47 crc kubenswrapper[4990]: I1003 12:23:47.928370 4990 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" podStartSLOduration=2.023055267 podStartE2EDuration="2.928348659s" podCreationTimestamp="2025-10-03 12:23:45 +0000 UTC" firstStartedPulling="2025-10-03 12:23:46.196756439 +0000 UTC m=+9607.993388296" lastFinishedPulling="2025-10-03 12:23:47.102049831 +0000 UTC m=+9608.898681688" observedRunningTime="2025-10-03 12:23:47.916642157 +0000 UTC m=+9609.713274024" watchObservedRunningTime="2025-10-03 12:23:47.928348659 +0000 UTC m=+9609.724980526" Oct 03 12:23:55 crc kubenswrapper[4990]: I1003 12:23:55.304408 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:23:55 crc kubenswrapper[4990]: I1003 12:23:55.305268 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:23:55 crc kubenswrapper[4990]: I1003 12:23:55.329487 4990 scope.go:117] "RemoveContainer" containerID="80ff68c4a3e79b5d970f66d8b340c581812ccd94c0e23b62621d70e3ce5516d4" Oct 03 12:24:25 crc kubenswrapper[4990]: I1003 12:24:25.304152 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:24:25 crc kubenswrapper[4990]: I1003 12:24:25.305152 4990 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.304723 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.306873 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.307091 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.308133 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.308411 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" 
containerID="cri-o://c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979" gracePeriod=600 Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.737948 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979" exitCode=0 Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.738009 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979"} Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.738443 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672"} Oct 03 12:24:55 crc kubenswrapper[4990]: I1003 12:24:55.738480 4990 scope.go:117] "RemoveContainer" containerID="bc0237eeb153b12e4a7b0156b3bd7bd6d5cb8f50e18ec77b0d1a35b61dafbcc0" Oct 03 12:26:55 crc kubenswrapper[4990]: I1003 12:26:55.303618 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:26:55 crc kubenswrapper[4990]: I1003 12:26:55.304193 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:27:25 crc kubenswrapper[4990]: 
I1003 12:27:25.304046 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:27:25 crc kubenswrapper[4990]: I1003 12:27:25.304711 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.303419 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.304047 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.304126 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.305369 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672"} 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.305458 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" gracePeriod=600 Oct 03 12:27:55 crc kubenswrapper[4990]: E1003 12:27:55.540316 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.894333 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" exitCode=0 Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.894383 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672"} Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.894425 4990 scope.go:117] "RemoveContainer" containerID="c393c94c73e4b3f716fa029f83abeacbd9cef0474370be6c770035d13ef9b979" Oct 03 12:27:55 crc kubenswrapper[4990]: I1003 12:27:55.895290 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 
03 12:27:55 crc kubenswrapper[4990]: E1003 12:27:55.895720 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:28:07 crc kubenswrapper[4990]: I1003 12:28:07.786814 4990 trace.go:236] Trace[1461352936]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (03-Oct-2025 12:28:06.551) (total time: 1235ms): Oct 03 12:28:07 crc kubenswrapper[4990]: Trace[1461352936]: [1.235694689s] [1.235694689s] END Oct 03 12:28:08 crc kubenswrapper[4990]: I1003 12:28:08.900346 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:28:08 crc kubenswrapper[4990]: E1003 12:28:08.901149 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:28:19 crc kubenswrapper[4990]: I1003 12:28:19.872455 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:28:19 crc kubenswrapper[4990]: E1003 12:28:19.873411 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:28:31 crc kubenswrapper[4990]: I1003 12:28:31.872643 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:28:31 crc kubenswrapper[4990]: E1003 12:28:31.873491 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:28:46 crc kubenswrapper[4990]: I1003 12:28:46.872420 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:28:46 crc kubenswrapper[4990]: E1003 12:28:46.873564 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.661801 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.665081 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.687013 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.810849 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.811275 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.811492 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xrx\" (UniqueName: \"kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.913889 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.913988 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.914028 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94xrx\" (UniqueName: \"kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.914537 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.914587 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:52 crc kubenswrapper[4990]: I1003 12:28:52.933621 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xrx\" (UniqueName: \"kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx\") pod \"community-operators-w82rj\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:53 crc kubenswrapper[4990]: I1003 12:28:53.006345 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:28:53 crc kubenswrapper[4990]: I1003 12:28:53.576654 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:28:54 crc kubenswrapper[4990]: I1003 12:28:54.661118 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerStarted","Data":"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a"} Oct 03 12:28:54 crc kubenswrapper[4990]: I1003 12:28:54.661458 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerStarted","Data":"31cfe62c6c774f78bcb1ed4b3c89c62c98f0e23165256ed4bad97c979827ffaa"} Oct 03 12:28:55 crc kubenswrapper[4990]: I1003 12:28:55.671316 4990 generic.go:334] "Generic (PLEG): container finished" podID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerID="03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a" exitCode=0 Oct 03 12:28:55 crc kubenswrapper[4990]: I1003 12:28:55.671419 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerDied","Data":"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a"} Oct 03 12:28:55 crc kubenswrapper[4990]: I1003 12:28:55.674537 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:29:00 crc kubenswrapper[4990]: I1003 12:29:00.752422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerStarted","Data":"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc"} Oct 03 12:29:00 crc 
kubenswrapper[4990]: I1003 12:29:00.872278 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:29:00 crc kubenswrapper[4990]: E1003 12:29:00.872749 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:29:04 crc kubenswrapper[4990]: I1003 12:29:04.816199 4990 generic.go:334] "Generic (PLEG): container finished" podID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerID="0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc" exitCode=0 Oct 03 12:29:04 crc kubenswrapper[4990]: I1003 12:29:04.816269 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerDied","Data":"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc"} Oct 03 12:29:06 crc kubenswrapper[4990]: I1003 12:29:06.839910 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerStarted","Data":"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a"} Oct 03 12:29:06 crc kubenswrapper[4990]: I1003 12:29:06.856756 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w82rj" podStartSLOduration=4.8058517720000005 podStartE2EDuration="14.856737759s" podCreationTimestamp="2025-10-03 12:28:52 +0000 UTC" firstStartedPulling="2025-10-03 12:28:55.674326467 +0000 UTC m=+9917.470958314" lastFinishedPulling="2025-10-03 
12:29:05.725212424 +0000 UTC m=+9927.521844301" observedRunningTime="2025-10-03 12:29:06.855333673 +0000 UTC m=+9928.651965530" watchObservedRunningTime="2025-10-03 12:29:06.856737759 +0000 UTC m=+9928.653369616" Oct 03 12:29:13 crc kubenswrapper[4990]: I1003 12:29:13.007588 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:13 crc kubenswrapper[4990]: I1003 12:29:13.008139 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:13 crc kubenswrapper[4990]: I1003 12:29:13.059794 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:13 crc kubenswrapper[4990]: I1003 12:29:13.995663 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:14 crc kubenswrapper[4990]: I1003 12:29:14.057943 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:29:15 crc kubenswrapper[4990]: I1003 12:29:15.872696 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:29:15 crc kubenswrapper[4990]: E1003 12:29:15.873421 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:29:15 crc kubenswrapper[4990]: I1003 12:29:15.961477 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-w82rj" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="registry-server" containerID="cri-o://7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a" gracePeriod=2 Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.517552 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.663029 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content\") pod \"61712a27-70de-4c84-a52d-d8e599c0b9f5\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.663120 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94xrx\" (UniqueName: \"kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx\") pod \"61712a27-70de-4c84-a52d-d8e599c0b9f5\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.663231 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities\") pod \"61712a27-70de-4c84-a52d-d8e599c0b9f5\" (UID: \"61712a27-70de-4c84-a52d-d8e599c0b9f5\") " Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.664869 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities" (OuterVolumeSpecName: "utilities") pod "61712a27-70de-4c84-a52d-d8e599c0b9f5" (UID: "61712a27-70de-4c84-a52d-d8e599c0b9f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.669439 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx" (OuterVolumeSpecName: "kube-api-access-94xrx") pod "61712a27-70de-4c84-a52d-d8e599c0b9f5" (UID: "61712a27-70de-4c84-a52d-d8e599c0b9f5"). InnerVolumeSpecName "kube-api-access-94xrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.747106 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61712a27-70de-4c84-a52d-d8e599c0b9f5" (UID: "61712a27-70de-4c84-a52d-d8e599c0b9f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.767276 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.767335 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94xrx\" (UniqueName: \"kubernetes.io/projected/61712a27-70de-4c84-a52d-d8e599c0b9f5-kube-api-access-94xrx\") on node \"crc\" DevicePath \"\"" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.767362 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61712a27-70de-4c84-a52d-d8e599c0b9f5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.971680 4990 generic.go:334] "Generic (PLEG): container finished" podID="61712a27-70de-4c84-a52d-d8e599c0b9f5" 
containerID="7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a" exitCode=0 Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.971751 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerDied","Data":"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a"} Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.971936 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w82rj" event={"ID":"61712a27-70de-4c84-a52d-d8e599c0b9f5","Type":"ContainerDied","Data":"31cfe62c6c774f78bcb1ed4b3c89c62c98f0e23165256ed4bad97c979827ffaa"} Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.971780 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w82rj" Oct 03 12:29:16 crc kubenswrapper[4990]: I1003 12:29:16.971958 4990 scope.go:117] "RemoveContainer" containerID="7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.001261 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.004237 4990 scope.go:117] "RemoveContainer" containerID="0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.012084 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w82rj"] Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.041452 4990 scope.go:117] "RemoveContainer" containerID="03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.083369 4990 scope.go:117] "RemoveContainer" containerID="7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a" Oct 03 
12:29:17 crc kubenswrapper[4990]: E1003 12:29:17.087203 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a\": container with ID starting with 7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a not found: ID does not exist" containerID="7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.087264 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a"} err="failed to get container status \"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a\": rpc error: code = NotFound desc = could not find container \"7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a\": container with ID starting with 7aae77297f2e2dd2cdd443cd6efca2f6535e6c6b110ac45ce0b6e086c5cffd0a not found: ID does not exist" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.087299 4990 scope.go:117] "RemoveContainer" containerID="0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc" Oct 03 12:29:17 crc kubenswrapper[4990]: E1003 12:29:17.087913 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc\": container with ID starting with 0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc not found: ID does not exist" containerID="0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.087941 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc"} err="failed to get container status 
\"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc\": rpc error: code = NotFound desc = could not find container \"0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc\": container with ID starting with 0607316fb7c8f56a52d1ef6b78b4463f05627771b435401e01a4ffa49bcbfbdc not found: ID does not exist" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.087958 4990 scope.go:117] "RemoveContainer" containerID="03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a" Oct 03 12:29:17 crc kubenswrapper[4990]: E1003 12:29:17.089035 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a\": container with ID starting with 03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a not found: ID does not exist" containerID="03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a" Oct 03 12:29:17 crc kubenswrapper[4990]: I1003 12:29:17.089063 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a"} err="failed to get container status \"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a\": rpc error: code = NotFound desc = could not find container \"03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a\": container with ID starting with 03fc391611972fb6a3865f30ca0cf8f6c7c2d15fcc71d6bf73c48dd3e576bb4a not found: ID does not exist" Oct 03 12:29:18 crc kubenswrapper[4990]: I1003 12:29:18.885441 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" path="/var/lib/kubelet/pods/61712a27-70de-4c84-a52d-d8e599c0b9f5/volumes" Oct 03 12:29:27 crc kubenswrapper[4990]: I1003 12:29:27.872556 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 
12:29:27 crc kubenswrapper[4990]: E1003 12:29:27.873998 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:29:38 crc kubenswrapper[4990]: I1003 12:29:38.897920 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:29:38 crc kubenswrapper[4990]: E1003 12:29:38.898927 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:29:52 crc kubenswrapper[4990]: I1003 12:29:52.871875 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:29:52 crc kubenswrapper[4990]: E1003 12:29:52.874149 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.146778 4990 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s"] Oct 03 12:30:00 crc kubenswrapper[4990]: E1003 12:30:00.148958 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="registry-server" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.149070 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="registry-server" Oct 03 12:30:00 crc kubenswrapper[4990]: E1003 12:30:00.149174 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="extract-utilities" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.149324 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="extract-utilities" Oct 03 12:30:00 crc kubenswrapper[4990]: E1003 12:30:00.149450 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="extract-content" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.149556 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="extract-content" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.149960 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="61712a27-70de-4c84-a52d-d8e599c0b9f5" containerName="registry-server" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.151039 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.166629 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s"] Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.190958 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.191446 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.216040 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.216467 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.216592 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npppv\" (UniqueName: \"kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.318397 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npppv\" (UniqueName: \"kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.318782 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.318884 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.320136 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.328688 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.352619 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npppv\" (UniqueName: \"kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv\") pod \"collect-profiles-29324910-jhw6s\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:00 crc kubenswrapper[4990]: I1003 12:30:00.512623 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:01 crc kubenswrapper[4990]: I1003 12:30:01.043586 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s"] Oct 03 12:30:01 crc kubenswrapper[4990]: I1003 12:30:01.563499 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" event={"ID":"a04704a5-6c64-44dc-8405-24c5e5d349b2","Type":"ContainerStarted","Data":"4eaf5901c90a663b8afe25c1e10bc8f00f4b947ccb57fc77c7f24798e5e4bc99"} Oct 03 12:30:01 crc kubenswrapper[4990]: I1003 12:30:01.563809 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" event={"ID":"a04704a5-6c64-44dc-8405-24c5e5d349b2","Type":"ContainerStarted","Data":"4b6ea7c3ce95d09756ccf2704d048981bfd57c00a455f5f51e19dac418f098a4"} Oct 03 12:30:01 crc kubenswrapper[4990]: I1003 12:30:01.583327 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" 
podStartSLOduration=1.5833109090000002 podStartE2EDuration="1.583310909s" podCreationTimestamp="2025-10-03 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:30:01.580944318 +0000 UTC m=+9983.377576185" watchObservedRunningTime="2025-10-03 12:30:01.583310909 +0000 UTC m=+9983.379942766" Oct 03 12:30:02 crc kubenswrapper[4990]: I1003 12:30:02.580471 4990 generic.go:334] "Generic (PLEG): container finished" podID="a04704a5-6c64-44dc-8405-24c5e5d349b2" containerID="4eaf5901c90a663b8afe25c1e10bc8f00f4b947ccb57fc77c7f24798e5e4bc99" exitCode=0 Oct 03 12:30:02 crc kubenswrapper[4990]: I1003 12:30:02.580573 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" event={"ID":"a04704a5-6c64-44dc-8405-24c5e5d349b2","Type":"ContainerDied","Data":"4eaf5901c90a663b8afe25c1e10bc8f00f4b947ccb57fc77c7f24798e5e4bc99"} Oct 03 12:30:03 crc kubenswrapper[4990]: I1003 12:30:03.872732 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:30:03 crc kubenswrapper[4990]: E1003 12:30:03.873343 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.001555 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.116726 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume\") pod \"a04704a5-6c64-44dc-8405-24c5e5d349b2\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.116911 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume\") pod \"a04704a5-6c64-44dc-8405-24c5e5d349b2\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.116984 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npppv\" (UniqueName: \"kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv\") pod \"a04704a5-6c64-44dc-8405-24c5e5d349b2\" (UID: \"a04704a5-6c64-44dc-8405-24c5e5d349b2\") " Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.118459 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "a04704a5-6c64-44dc-8405-24c5e5d349b2" (UID: "a04704a5-6c64-44dc-8405-24c5e5d349b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.123824 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a04704a5-6c64-44dc-8405-24c5e5d349b2" (UID: "a04704a5-6c64-44dc-8405-24c5e5d349b2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.126824 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv" (OuterVolumeSpecName: "kube-api-access-npppv") pod "a04704a5-6c64-44dc-8405-24c5e5d349b2" (UID: "a04704a5-6c64-44dc-8405-24c5e5d349b2"). InnerVolumeSpecName "kube-api-access-npppv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.220736 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npppv\" (UniqueName: \"kubernetes.io/projected/a04704a5-6c64-44dc-8405-24c5e5d349b2-kube-api-access-npppv\") on node \"crc\" DevicePath \"\"" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.220789 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a04704a5-6c64-44dc-8405-24c5e5d349b2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.220809 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a04704a5-6c64-44dc-8405-24c5e5d349b2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.611641 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" event={"ID":"a04704a5-6c64-44dc-8405-24c5e5d349b2","Type":"ContainerDied","Data":"4b6ea7c3ce95d09756ccf2704d048981bfd57c00a455f5f51e19dac418f098a4"} Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.611698 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b6ea7c3ce95d09756ccf2704d048981bfd57c00a455f5f51e19dac418f098a4" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.611721 4990 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324910-jhw6s" Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.680730 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj"] Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.692360 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324865-h7tzj"] Oct 03 12:30:04 crc kubenswrapper[4990]: I1003 12:30:04.886316 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b184dd-01eb-4959-a5ff-cebaf932f6fe" path="/var/lib/kubelet/pods/c5b184dd-01eb-4959-a5ff-cebaf932f6fe/volumes" Oct 03 12:30:14 crc kubenswrapper[4990]: I1003 12:30:14.872436 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:30:14 crc kubenswrapper[4990]: E1003 12:30:14.873202 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:27 crc kubenswrapper[4990]: I1003 12:30:27.873357 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:30:27 crc kubenswrapper[4990]: E1003 12:30:27.874581 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:35 crc kubenswrapper[4990]: I1003 12:30:35.710937 4990 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-686dbdfb85-nlj9c" podUID="9153262d-7256-4581-b8d7-bd3575ece8b1" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 03 12:30:41 crc kubenswrapper[4990]: I1003 12:30:41.872339 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:30:41 crc kubenswrapper[4990]: E1003 12:30:41.873313 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:52 crc kubenswrapper[4990]: I1003 12:30:52.872598 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:30:52 crc kubenswrapper[4990]: E1003 12:30:52.873417 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:30:55 crc kubenswrapper[4990]: I1003 12:30:55.799307 4990 scope.go:117] "RemoveContainer" containerID="31ebcdb2e66616577868679543f8fedc107d3c95641655f36a795f60c9dc6c85" Oct 03 12:31:03 crc kubenswrapper[4990]: I1003 
12:31:03.873290 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:31:03 crc kubenswrapper[4990]: E1003 12:31:03.874424 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:31:16 crc kubenswrapper[4990]: I1003 12:31:16.872212 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:31:16 crc kubenswrapper[4990]: E1003 12:31:16.872994 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:31:31 crc kubenswrapper[4990]: I1003 12:31:31.871960 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:31:31 crc kubenswrapper[4990]: E1003 12:31:31.872821 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:31:43 crc 
kubenswrapper[4990]: I1003 12:31:43.872781 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:31:43 crc kubenswrapper[4990]: E1003 12:31:43.873802 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:31:57 crc kubenswrapper[4990]: I1003 12:31:57.872208 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:31:57 crc kubenswrapper[4990]: E1003 12:31:57.873455 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:32:10 crc kubenswrapper[4990]: I1003 12:32:10.872503 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:32:10 crc kubenswrapper[4990]: E1003 12:32:10.873242 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 
03 12:32:24 crc kubenswrapper[4990]: I1003 12:32:24.873758 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:32:24 crc kubenswrapper[4990]: E1003 12:32:24.875421 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:32:39 crc kubenswrapper[4990]: I1003 12:32:39.872086 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:32:39 crc kubenswrapper[4990]: E1003 12:32:39.872929 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:32:54 crc kubenswrapper[4990]: I1003 12:32:54.872298 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:32:54 crc kubenswrapper[4990]: E1003 12:32:54.873078 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:33:06 crc kubenswrapper[4990]: I1003 12:33:06.949601 4990 generic.go:334] "Generic (PLEG): container finished" podID="f78234e5-0ad8-42d0-9854-c5b70d570beb" containerID="d78b6c32694055fe026cc9e36a75c18ae40975e6103860b3244fe6ca192bb7b8" exitCode=0 Oct 03 12:33:06 crc kubenswrapper[4990]: I1003 12:33:06.949701 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" event={"ID":"f78234e5-0ad8-42d0-9854-c5b70d570beb","Type":"ContainerDied","Data":"d78b6c32694055fe026cc9e36a75c18ae40975e6103860b3244fe6ca192bb7b8"} Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.515528 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.549745 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.549911 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.549942 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc 
kubenswrapper[4990]: I1003 12:33:08.549996 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.550031 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd7qg\" (UniqueName: \"kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.550070 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.550090 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.550133 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.550175 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" 
(UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1\") pod \"f78234e5-0ad8-42d0-9854-c5b70d570beb\" (UID: \"f78234e5-0ad8-42d0-9854-c5b70d570beb\") " Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.572991 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg" (OuterVolumeSpecName: "kube-api-access-kd7qg") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "kube-api-access-kd7qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.573127 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.594209 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.597271 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). 
InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.597922 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.598316 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.598792 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory" (OuterVolumeSpecName: "inventory") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.600680 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.612015 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f78234e5-0ad8-42d0-9854-c5b70d570beb" (UID: "f78234e5-0ad8-42d0-9854-c5b70d570beb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.652643 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.652897 4990 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.652965 4990 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653021 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653084 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd7qg\" (UniqueName: \"kubernetes.io/projected/f78234e5-0ad8-42d0-9854-c5b70d570beb-kube-api-access-kd7qg\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653140 
4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653203 4990 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653274 4990 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.653337 4990 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f78234e5-0ad8-42d0-9854-c5b70d570beb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.881840 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.976182 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" event={"ID":"f78234e5-0ad8-42d0-9854-c5b70d570beb","Type":"ContainerDied","Data":"6d00cbc0fc79fdfa0dab3becca512e52b9a17118d5036ff65049654a4225ab58"} Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.976765 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d00cbc0fc79fdfa0dab3becca512e52b9a17118d5036ff65049654a4225ab58" Oct 03 12:33:08 crc kubenswrapper[4990]: I1003 12:33:08.976309 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l" Oct 03 12:33:09 crc kubenswrapper[4990]: I1003 12:33:09.990780 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327"} Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.442487 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:14 crc kubenswrapper[4990]: E1003 12:33:14.443643 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04704a5-6c64-44dc-8405-24c5e5d349b2" containerName="collect-profiles" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.443666 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04704a5-6c64-44dc-8405-24c5e5d349b2" containerName="collect-profiles" Oct 03 12:33:14 crc kubenswrapper[4990]: E1003 12:33:14.443796 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78234e5-0ad8-42d0-9854-c5b70d570beb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.443809 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78234e5-0ad8-42d0-9854-c5b70d570beb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.444085 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78234e5-0ad8-42d0-9854-c5b70d570beb" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.444105 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04704a5-6c64-44dc-8405-24c5e5d349b2" containerName="collect-profiles" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 
12:33:14.445892 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.445989 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.493558 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.493712 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s69r\" (UniqueName: \"kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.493787 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.595658 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s69r\" (UniqueName: \"kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 
12:33:14.595881 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.595937 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.596554 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.598737 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.671069 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s69r\" (UniqueName: \"kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r\") pod \"redhat-operators-l2mqh\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:14 crc kubenswrapper[4990]: I1003 12:33:14.800842 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:15 crc kubenswrapper[4990]: I1003 12:33:15.263295 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:16 crc kubenswrapper[4990]: I1003 12:33:16.092773 4990 generic.go:334] "Generic (PLEG): container finished" podID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerID="0e68bca7dca439221e5f6c54a34f5afc51284bd71533dbfa42c5dcefa08132e9" exitCode=0 Oct 03 12:33:16 crc kubenswrapper[4990]: I1003 12:33:16.092825 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerDied","Data":"0e68bca7dca439221e5f6c54a34f5afc51284bd71533dbfa42c5dcefa08132e9"} Oct 03 12:33:16 crc kubenswrapper[4990]: I1003 12:33:16.092873 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerStarted","Data":"c3a64b4fb092969f890263e434c31379a0d61d41ca6d4e1d845ce1ec1ce1a2e7"} Oct 03 12:33:18 crc kubenswrapper[4990]: I1003 12:33:18.136211 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerStarted","Data":"753aed37e28250e1fc8e441756b47b9ad544c02f5dbc944e90d6a688cefa1561"} Oct 03 12:33:22 crc kubenswrapper[4990]: E1003 12:33:22.389364 4990 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb202539_fd00_408c_a3b1_ce7c745bfc23.slice/crio-753aed37e28250e1fc8e441756b47b9ad544c02f5dbc944e90d6a688cefa1561.scope\": RecentStats: unable to find data in memory cache]" Oct 03 12:33:23 crc kubenswrapper[4990]: I1003 12:33:23.224727 4990 generic.go:334] "Generic (PLEG): container finished" 
podID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerID="753aed37e28250e1fc8e441756b47b9ad544c02f5dbc944e90d6a688cefa1561" exitCode=0 Oct 03 12:33:23 crc kubenswrapper[4990]: I1003 12:33:23.224887 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerDied","Data":"753aed37e28250e1fc8e441756b47b9ad544c02f5dbc944e90d6a688cefa1561"} Oct 03 12:33:24 crc kubenswrapper[4990]: I1003 12:33:24.237865 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerStarted","Data":"98ac7bad193dbb90a76da125537dce14bb11e5703451279c14606432afd817d9"} Oct 03 12:33:24 crc kubenswrapper[4990]: I1003 12:33:24.272026 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l2mqh" podStartSLOduration=2.7227498089999997 podStartE2EDuration="10.271997576s" podCreationTimestamp="2025-10-03 12:33:14 +0000 UTC" firstStartedPulling="2025-10-03 12:33:16.095405405 +0000 UTC m=+10177.892037292" lastFinishedPulling="2025-10-03 12:33:23.644653192 +0000 UTC m=+10185.441285059" observedRunningTime="2025-10-03 12:33:24.256105055 +0000 UTC m=+10186.052736912" watchObservedRunningTime="2025-10-03 12:33:24.271997576 +0000 UTC m=+10186.068629473" Oct 03 12:33:24 crc kubenswrapper[4990]: I1003 12:33:24.801015 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:24 crc kubenswrapper[4990]: I1003 12:33:24.801070 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:25 crc kubenswrapper[4990]: I1003 12:33:25.849709 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2mqh" 
podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" probeResult="failure" output=< Oct 03 12:33:25 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:33:25 crc kubenswrapper[4990]: > Oct 03 12:33:35 crc kubenswrapper[4990]: I1003 12:33:35.855839 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2mqh" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" probeResult="failure" output=< Oct 03 12:33:35 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:33:35 crc kubenswrapper[4990]: > Oct 03 12:33:46 crc kubenswrapper[4990]: I1003 12:33:46.232822 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l2mqh" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" probeResult="failure" output=< Oct 03 12:33:46 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:33:46 crc kubenswrapper[4990]: > Oct 03 12:33:54 crc kubenswrapper[4990]: I1003 12:33:54.852525 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:54 crc kubenswrapper[4990]: I1003 12:33:54.922868 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:55 crc kubenswrapper[4990]: I1003 12:33:55.111924 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:56 crc kubenswrapper[4990]: I1003 12:33:56.630713 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l2mqh" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" containerID="cri-o://98ac7bad193dbb90a76da125537dce14bb11e5703451279c14606432afd817d9" gracePeriod=2 Oct 03 
12:33:57 crc kubenswrapper[4990]: I1003 12:33:57.655938 4990 generic.go:334] "Generic (PLEG): container finished" podID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerID="98ac7bad193dbb90a76da125537dce14bb11e5703451279c14606432afd817d9" exitCode=0 Oct 03 12:33:57 crc kubenswrapper[4990]: I1003 12:33:57.655980 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerDied","Data":"98ac7bad193dbb90a76da125537dce14bb11e5703451279c14606432afd817d9"} Oct 03 12:33:57 crc kubenswrapper[4990]: I1003 12:33:57.907235 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.049837 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") pod \"bb202539-fd00-408c-a3b1-ce7c745bfc23\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.049928 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities\") pod \"bb202539-fd00-408c-a3b1-ce7c745bfc23\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.050035 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s69r\" (UniqueName: \"kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r\") pod \"bb202539-fd00-408c-a3b1-ce7c745bfc23\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.050701 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities" (OuterVolumeSpecName: "utilities") pod "bb202539-fd00-408c-a3b1-ce7c745bfc23" (UID: "bb202539-fd00-408c-a3b1-ce7c745bfc23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.059743 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r" (OuterVolumeSpecName: "kube-api-access-2s69r") pod "bb202539-fd00-408c-a3b1-ce7c745bfc23" (UID: "bb202539-fd00-408c-a3b1-ce7c745bfc23"). InnerVolumeSpecName "kube-api-access-2s69r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.151350 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb202539-fd00-408c-a3b1-ce7c745bfc23" (UID: "bb202539-fd00-408c-a3b1-ce7c745bfc23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.151609 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") pod \"bb202539-fd00-408c-a3b1-ce7c745bfc23\" (UID: \"bb202539-fd00-408c-a3b1-ce7c745bfc23\") " Oct 03 12:33:58 crc kubenswrapper[4990]: W1003 12:33:58.151693 4990 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bb202539-fd00-408c-a3b1-ce7c745bfc23/volumes/kubernetes.io~empty-dir/catalog-content Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.151707 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb202539-fd00-408c-a3b1-ce7c745bfc23" (UID: "bb202539-fd00-408c-a3b1-ce7c745bfc23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.152314 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.152339 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb202539-fd00-408c-a3b1-ce7c745bfc23-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.152352 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s69r\" (UniqueName: \"kubernetes.io/projected/bb202539-fd00-408c-a3b1-ce7c745bfc23-kube-api-access-2s69r\") on node \"crc\" DevicePath \"\"" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.678971 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l2mqh" event={"ID":"bb202539-fd00-408c-a3b1-ce7c745bfc23","Type":"ContainerDied","Data":"c3a64b4fb092969f890263e434c31379a0d61d41ca6d4e1d845ce1ec1ce1a2e7"} Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.679056 4990 scope.go:117] "RemoveContainer" containerID="98ac7bad193dbb90a76da125537dce14bb11e5703451279c14606432afd817d9" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.679065 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l2mqh" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.702055 4990 scope.go:117] "RemoveContainer" containerID="753aed37e28250e1fc8e441756b47b9ad544c02f5dbc944e90d6a688cefa1561" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.713420 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.723406 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l2mqh"] Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.752692 4990 scope.go:117] "RemoveContainer" containerID="0e68bca7dca439221e5f6c54a34f5afc51284bd71533dbfa42c5dcefa08132e9" Oct 03 12:33:58 crc kubenswrapper[4990]: I1003 12:33:58.883993 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" path="/var/lib/kubelet/pods/bb202539-fd00-408c-a3b1-ce7c745bfc23/volumes" Oct 03 12:34:55 crc kubenswrapper[4990]: I1003 12:34:55.549939 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 12:34:55 crc kubenswrapper[4990]: I1003 12:34:55.550733 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" containerName="adoption" containerID="cri-o://eaac1eb10a5eb2bae783302519adf0ef16f9b747b48e02bac1a6672913d2f3c8" gracePeriod=30 Oct 03 12:35:25 crc kubenswrapper[4990]: I1003 12:35:25.303805 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:35:25 crc kubenswrapper[4990]: I1003 12:35:25.304602 4990 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:35:25 crc kubenswrapper[4990]: I1003 12:35:25.708766 4990 generic.go:334] "Generic (PLEG): container finished" podID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" containerID="eaac1eb10a5eb2bae783302519adf0ef16f9b747b48e02bac1a6672913d2f3c8" exitCode=137 Oct 03 12:35:25 crc kubenswrapper[4990]: I1003 12:35:25.708816 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0b9a6e33-94e4-4df7-9d5d-dcadfc621424","Type":"ContainerDied","Data":"eaac1eb10a5eb2bae783302519adf0ef16f9b747b48e02bac1a6672913d2f3c8"} Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.144374 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.180970 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") pod \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.181839 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rctf\" (UniqueName: \"kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf\") pod \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\" (UID: \"0b9a6e33-94e4-4df7-9d5d-dcadfc621424\") " Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.188282 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf" (OuterVolumeSpecName: 
"kube-api-access-6rctf") pod "0b9a6e33-94e4-4df7-9d5d-dcadfc621424" (UID: "0b9a6e33-94e4-4df7-9d5d-dcadfc621424"). InnerVolumeSpecName "kube-api-access-6rctf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.236707 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7" (OuterVolumeSpecName: "mariadb-data") pod "0b9a6e33-94e4-4df7-9d5d-dcadfc621424" (UID: "0b9a6e33-94e4-4df7-9d5d-dcadfc621424"). InnerVolumeSpecName "pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.283961 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rctf\" (UniqueName: \"kubernetes.io/projected/0b9a6e33-94e4-4df7-9d5d-dcadfc621424-kube-api-access-6rctf\") on node \"crc\" DevicePath \"\"" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.284051 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") on node \"crc\" " Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.321785 4990 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.322764 4990 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7") on node "crc" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.385506 4990 reconciler_common.go:293] "Volume detached for volume \"pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16c5c4f9-ba72-479d-9049-9a2a1674c7f7\") on node \"crc\" DevicePath \"\"" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.719774 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0b9a6e33-94e4-4df7-9d5d-dcadfc621424","Type":"ContainerDied","Data":"808b2abfb8a8b141d0378e7c4cd776d8745c9df1135258a70843b81abdf00514"} Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.719824 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.719840 4990 scope.go:117] "RemoveContainer" containerID="eaac1eb10a5eb2bae783302519adf0ef16f9b747b48e02bac1a6672913d2f3c8" Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.763243 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.774612 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 12:35:26 crc kubenswrapper[4990]: I1003 12:35:26.885441 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" path="/var/lib/kubelet/pods/0b9a6e33-94e4-4df7-9d5d-dcadfc621424/volumes" Oct 03 12:35:27 crc kubenswrapper[4990]: I1003 12:35:27.518553 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 12:35:27 crc kubenswrapper[4990]: I1003 12:35:27.519048 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="5241c94f-789d-4e82-84d1-8765ee56934a" containerName="adoption" containerID="cri-o://7449e52444e458abc0f544390d150c62f721fb6e1e4e75cb66786effe49ad66a" gracePeriod=30 Oct 03 12:35:55 crc kubenswrapper[4990]: I1003 12:35:55.304163 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:35:55 crc kubenswrapper[4990]: I1003 12:35:55.304890 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.106145 4990 generic.go:334] "Generic (PLEG): container finished" podID="5241c94f-789d-4e82-84d1-8765ee56934a" containerID="7449e52444e458abc0f544390d150c62f721fb6e1e4e75cb66786effe49ad66a" exitCode=137 Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.106215 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5241c94f-789d-4e82-84d1-8765ee56934a","Type":"ContainerDied","Data":"7449e52444e458abc0f544390d150c62f721fb6e1e4e75cb66786effe49ad66a"} Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.538770 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.667551 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert\") pod \"5241c94f-789d-4e82-84d1-8765ee56934a\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.668444 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") pod \"5241c94f-789d-4e82-84d1-8765ee56934a\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.668735 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvs7f\" (UniqueName: \"kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f\") pod \"5241c94f-789d-4e82-84d1-8765ee56934a\" (UID: \"5241c94f-789d-4e82-84d1-8765ee56934a\") " Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.675252 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "5241c94f-789d-4e82-84d1-8765ee56934a" (UID: "5241c94f-789d-4e82-84d1-8765ee56934a"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.675358 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f" (OuterVolumeSpecName: "kube-api-access-rvs7f") pod "5241c94f-789d-4e82-84d1-8765ee56934a" (UID: "5241c94f-789d-4e82-84d1-8765ee56934a"). InnerVolumeSpecName "kube-api-access-rvs7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.692543 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945" (OuterVolumeSpecName: "ovn-data") pod "5241c94f-789d-4e82-84d1-8765ee56934a" (UID: "5241c94f-789d-4e82-84d1-8765ee56934a"). InnerVolumeSpecName "pvc-b057a844-300d-44e0-9a41-622edc620945". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.770901 4990 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5241c94f-789d-4e82-84d1-8765ee56934a-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.770960 4990 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") on node \"crc\" " Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.770972 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvs7f\" (UniqueName: \"kubernetes.io/projected/5241c94f-789d-4e82-84d1-8765ee56934a-kube-api-access-rvs7f\") on node \"crc\" DevicePath \"\"" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.819629 4990 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.819824 4990 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b057a844-300d-44e0-9a41-622edc620945" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945") on node "crc" Oct 03 12:35:58 crc kubenswrapper[4990]: I1003 12:35:58.872354 4990 reconciler_common.go:293] "Volume detached for volume \"pvc-b057a844-300d-44e0-9a41-622edc620945\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b057a844-300d-44e0-9a41-622edc620945\") on node \"crc\" DevicePath \"\"" Oct 03 12:35:59 crc kubenswrapper[4990]: I1003 12:35:59.118154 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5241c94f-789d-4e82-84d1-8765ee56934a","Type":"ContainerDied","Data":"00ba4025e9b1253112eb96c83b0fa1ba53902f2c32474c44697dc9845c0febb9"} Oct 03 12:35:59 crc kubenswrapper[4990]: I1003 12:35:59.118213 4990 scope.go:117] "RemoveContainer" containerID="7449e52444e458abc0f544390d150c62f721fb6e1e4e75cb66786effe49ad66a" Oct 03 12:35:59 crc kubenswrapper[4990]: I1003 12:35:59.118365 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 12:35:59 crc kubenswrapper[4990]: I1003 12:35:59.156540 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 12:35:59 crc kubenswrapper[4990]: I1003 12:35:59.171113 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 12:36:00 crc kubenswrapper[4990]: I1003 12:36:00.892842 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5241c94f-789d-4e82-84d1-8765ee56934a" path="/var/lib/kubelet/pods/5241c94f-789d-4e82-84d1-8765ee56934a/volumes" Oct 03 12:36:25 crc kubenswrapper[4990]: I1003 12:36:25.307686 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:36:25 crc kubenswrapper[4990]: I1003 12:36:25.308814 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:36:25 crc kubenswrapper[4990]: I1003 12:36:25.308900 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:36:25 crc kubenswrapper[4990]: I1003 12:36:25.310278 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 
12:36:25 crc kubenswrapper[4990]: I1003 12:36:25.310377 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327" gracePeriod=600 Oct 03 12:36:26 crc kubenswrapper[4990]: I1003 12:36:26.441962 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327" exitCode=0 Oct 03 12:36:26 crc kubenswrapper[4990]: I1003 12:36:26.442131 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327"} Oct 03 12:36:26 crc kubenswrapper[4990]: I1003 12:36:26.442496 4990 scope.go:117] "RemoveContainer" containerID="7aecf19edcda2891e66598ca8d09b9c8b169781c5dfaf783835bd3902988b672" Oct 03 12:36:27 crc kubenswrapper[4990]: I1003 12:36:27.455235 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01"} Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.450488 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ggmzc/must-gather-2bw77"] Oct 03 12:37:02 crc kubenswrapper[4990]: E1003 12:37:02.451399 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451411 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" Oct 03 12:37:02 crc kubenswrapper[4990]: E1003 12:37:02.451427 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5241c94f-789d-4e82-84d1-8765ee56934a" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451434 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="5241c94f-789d-4e82-84d1-8765ee56934a" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: E1003 12:37:02.451460 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="extract-content" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451466 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="extract-content" Oct 03 12:37:02 crc kubenswrapper[4990]: E1003 12:37:02.451473 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="extract-utilities" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451479 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="extract-utilities" Oct 03 12:37:02 crc kubenswrapper[4990]: E1003 12:37:02.451491 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451497 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451700 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9a6e33-94e4-4df7-9d5d-dcadfc621424" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451712 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bb202539-fd00-408c-a3b1-ce7c745bfc23" containerName="registry-server" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.451726 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="5241c94f-789d-4e82-84d1-8765ee56934a" containerName="adoption" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.452913 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.455311 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ggmzc"/"openshift-service-ca.crt" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.456034 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ggmzc"/"kube-root-ca.crt" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.456729 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ggmzc"/"default-dockercfg-twlnd" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.461342 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ggmzc/must-gather-2bw77"] Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.521749 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.521975 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv46\" (UniqueName: \"kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " 
pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.624324 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.624535 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndv46\" (UniqueName: \"kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.624880 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.644385 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv46\" (UniqueName: \"kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46\") pod \"must-gather-2bw77\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") " pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:02 crc kubenswrapper[4990]: I1003 12:37:02.773564 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/must-gather-2bw77" Oct 03 12:37:03 crc kubenswrapper[4990]: I1003 12:37:03.329479 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:37:03 crc kubenswrapper[4990]: I1003 12:37:03.330386 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ggmzc/must-gather-2bw77"] Oct 03 12:37:03 crc kubenswrapper[4990]: I1003 12:37:03.920912 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/must-gather-2bw77" event={"ID":"15f06efd-0a47-438e-a547-50e70209f63d","Type":"ContainerStarted","Data":"111fb4852cd903ffb170a2543a41dac2e904ec69e9a5761546cc4bcc6f2c424e"} Oct 03 12:37:10 crc kubenswrapper[4990]: I1003 12:37:10.009732 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/must-gather-2bw77" event={"ID":"15f06efd-0a47-438e-a547-50e70209f63d","Type":"ContainerStarted","Data":"84b68b65fcc3d4ccb615776a2091f6366064c9c62091e01537f1361127657121"} Oct 03 12:37:11 crc kubenswrapper[4990]: I1003 12:37:11.022786 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/must-gather-2bw77" event={"ID":"15f06efd-0a47-438e-a547-50e70209f63d","Type":"ContainerStarted","Data":"f78ee03bbad3d3a6b6e4314b9a7b3467aa2ca8b401afcc131962a29788df18ea"} Oct 03 12:37:11 crc kubenswrapper[4990]: I1003 12:37:11.049433 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ggmzc/must-gather-2bw77" podStartSLOduration=2.795644145 podStartE2EDuration="9.049402047s" podCreationTimestamp="2025-10-03 12:37:02 +0000 UTC" firstStartedPulling="2025-10-03 12:37:03.329288786 +0000 UTC m=+10405.125920643" lastFinishedPulling="2025-10-03 12:37:09.583046688 +0000 UTC m=+10411.379678545" observedRunningTime="2025-10-03 12:37:11.041286207 +0000 UTC m=+10412.837918104" watchObservedRunningTime="2025-10-03 12:37:11.049402047 +0000 UTC 
m=+10412.846033944" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.781670 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-hvtvd"] Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.783473 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.893357 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9p69\" (UniqueName: \"kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.893793 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.995390 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.995522 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9p69\" (UniqueName: \"kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:13 crc kubenswrapper[4990]: I1003 12:37:13.995564 4990 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:14 crc kubenswrapper[4990]: I1003 12:37:14.013471 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9p69\" (UniqueName: \"kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69\") pod \"crc-debug-hvtvd\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:14 crc kubenswrapper[4990]: I1003 12:37:14.101374 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:37:14 crc kubenswrapper[4990]: W1003 12:37:14.241306 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d82216_4aa6_4fea_822d_d9123b9586a9.slice/crio-b7f22dcb173b40f6106fba280412027299276cbdac3b22a05dc39bb205a02a1c WatchSource:0}: Error finding container b7f22dcb173b40f6106fba280412027299276cbdac3b22a05dc39bb205a02a1c: Status 404 returned error can't find the container with id b7f22dcb173b40f6106fba280412027299276cbdac3b22a05dc39bb205a02a1c Oct 03 12:37:15 crc kubenswrapper[4990]: I1003 12:37:15.118130 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" event={"ID":"29d82216-4aa6-4fea-822d-d9123b9586a9","Type":"ContainerStarted","Data":"b7f22dcb173b40f6106fba280412027299276cbdac3b22a05dc39bb205a02a1c"} Oct 03 12:37:27 crc kubenswrapper[4990]: I1003 12:37:27.279093 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" 
event={"ID":"29d82216-4aa6-4fea-822d-d9123b9586a9","Type":"ContainerStarted","Data":"b773e09589c50ba4c82c49f2e2e27e4210bab62a2f2bfcf2a272d00def146c1f"} Oct 03 12:37:27 crc kubenswrapper[4990]: I1003 12:37:27.300563 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" podStartSLOduration=2.324203427 podStartE2EDuration="14.300547898s" podCreationTimestamp="2025-10-03 12:37:13 +0000 UTC" firstStartedPulling="2025-10-03 12:37:14.250835813 +0000 UTC m=+10416.047467670" lastFinishedPulling="2025-10-03 12:37:26.227180284 +0000 UTC m=+10428.023812141" observedRunningTime="2025-10-03 12:37:27.290678662 +0000 UTC m=+10429.087310539" watchObservedRunningTime="2025-10-03 12:37:27.300547898 +0000 UTC m=+10429.097179755" Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.824224 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.827596 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.853375 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.941910 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcv2\" (UniqueName: \"kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.942044 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:56 crc kubenswrapper[4990]: I1003 12:37:56.942998 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.045649 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.045773 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.045862 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcv2\" (UniqueName: \"kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.046132 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.046333 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.101673 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcv2\" (UniqueName: \"kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2\") pod \"certified-operators-sqx7m\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.166575 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:37:57 crc kubenswrapper[4990]: I1003 12:37:57.801104 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:37:58 crc kubenswrapper[4990]: I1003 12:37:58.634372 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerStarted","Data":"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1"} Oct 03 12:37:58 crc kubenswrapper[4990]: I1003 12:37:58.634788 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerStarted","Data":"eb60fabfb0f7f9a8685191907b76cf25360bb1f286987680da15f4403bcd56b4"} Oct 03 12:37:59 crc kubenswrapper[4990]: I1003 12:37:59.647356 4990 generic.go:334] "Generic (PLEG): container finished" podID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerID="b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1" exitCode=0 Oct 03 12:37:59 crc kubenswrapper[4990]: I1003 12:37:59.647550 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerDied","Data":"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1"} Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.001065 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.011884 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.066537 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.168108 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqv9\" (UniqueName: \"kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.168305 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.168344 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.269905 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqv9\" (UniqueName: \"kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.270052 4990 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.270076 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.270648 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.270693 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.291402 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqv9\" (UniqueName: \"kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9\") pod \"redhat-marketplace-bn26p\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:02 crc kubenswrapper[4990]: I1003 12:38:02.345275 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:38:03 crc kubenswrapper[4990]: I1003 12:38:03.197414 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:38:03 crc kubenswrapper[4990]: I1003 12:38:03.734780 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerStarted","Data":"a3e1b316458bfa73e0eca3c0f0af7c27c75f7893c931ab76c1a3923deaff130c"} Oct 03 12:38:05 crc kubenswrapper[4990]: I1003 12:38:05.775750 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerStarted","Data":"69ead2eee8be9c7d1c4c8514460b223b253ec758ee3b2459fa692a7736141406"} Oct 03 12:38:06 crc kubenswrapper[4990]: I1003 12:38:06.786773 4990 generic.go:334] "Generic (PLEG): container finished" podID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerID="69ead2eee8be9c7d1c4c8514460b223b253ec758ee3b2459fa692a7736141406" exitCode=0 Oct 03 12:38:06 crc kubenswrapper[4990]: I1003 12:38:06.786875 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerDied","Data":"69ead2eee8be9c7d1c4c8514460b223b253ec758ee3b2459fa692a7736141406"} Oct 03 12:38:15 crc kubenswrapper[4990]: I1003 12:38:15.892961 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerStarted","Data":"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1"} Oct 03 12:38:32 crc kubenswrapper[4990]: I1003 12:38:32.060710 4990 generic.go:334] "Generic (PLEG): container finished" podID="17f855a0-3961-4b34-a175-c0a9c94bee58" 
containerID="11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1" exitCode=0 Oct 03 12:38:32 crc kubenswrapper[4990]: I1003 12:38:32.061571 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerDied","Data":"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1"} Oct 03 12:38:37 crc kubenswrapper[4990]: I1003 12:38:37.135584 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerStarted","Data":"dbec29cd5511114c735b22e90c9e48d39447fa1bd75fe9d6701150badd0a1332"} Oct 03 12:38:44 crc kubenswrapper[4990]: I1003 12:38:44.212577 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerStarted","Data":"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498"} Oct 03 12:38:45 crc kubenswrapper[4990]: I1003 12:38:45.250659 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sqx7m" podStartSLOduration=7.645921638 podStartE2EDuration="49.250636572s" podCreationTimestamp="2025-10-03 12:37:56 +0000 UTC" firstStartedPulling="2025-10-03 12:37:59.649565807 +0000 UTC m=+10461.446197664" lastFinishedPulling="2025-10-03 12:38:41.254280701 +0000 UTC m=+10503.050912598" observedRunningTime="2025-10-03 12:38:45.243450146 +0000 UTC m=+10507.040082033" watchObservedRunningTime="2025-10-03 12:38:45.250636572 +0000 UTC m=+10507.047268439" Oct 03 12:38:47 crc kubenswrapper[4990]: I1003 12:38:47.166908 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:47 crc kubenswrapper[4990]: I1003 12:38:47.167388 4990 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:47 crc kubenswrapper[4990]: I1003 12:38:47.251001 4990 generic.go:334] "Generic (PLEG): container finished" podID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerID="dbec29cd5511114c735b22e90c9e48d39447fa1bd75fe9d6701150badd0a1332" exitCode=0 Oct 03 12:38:47 crc kubenswrapper[4990]: I1003 12:38:47.251060 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerDied","Data":"dbec29cd5511114c735b22e90c9e48d39447fa1bd75fe9d6701150badd0a1332"} Oct 03 12:38:48 crc kubenswrapper[4990]: I1003 12:38:48.241062 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sqx7m" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="registry-server" probeResult="failure" output=< Oct 03 12:38:48 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:38:48 crc kubenswrapper[4990]: > Oct 03 12:38:54 crc kubenswrapper[4990]: I1003 12:38:54.414904 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerStarted","Data":"ee4f2e4a9bc59901ab77818ca38dce61d7b8ba70c552bd2fbba2fe3edaec81cc"} Oct 03 12:38:54 crc kubenswrapper[4990]: I1003 12:38:54.443616 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bn26p" podStartSLOduration=10.388406598 podStartE2EDuration="53.443591651s" podCreationTimestamp="2025-10-03 12:38:01 +0000 UTC" firstStartedPulling="2025-10-03 12:38:09.155618563 +0000 UTC m=+10470.952250430" lastFinishedPulling="2025-10-03 12:38:52.210803636 +0000 UTC m=+10514.007435483" observedRunningTime="2025-10-03 12:38:54.443367875 +0000 UTC m=+10516.239999742" watchObservedRunningTime="2025-10-03 
12:38:54.443591651 +0000 UTC m=+10516.240223508" Oct 03 12:38:55 crc kubenswrapper[4990]: I1003 12:38:55.304780 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:38:55 crc kubenswrapper[4990]: I1003 12:38:55.305138 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:38:57 crc kubenswrapper[4990]: I1003 12:38:57.234324 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:57 crc kubenswrapper[4990]: I1003 12:38:57.293327 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:58 crc kubenswrapper[4990]: I1003 12:38:58.050041 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:38:58 crc kubenswrapper[4990]: I1003 12:38:58.457036 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sqx7m" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="registry-server" containerID="cri-o://1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498" gracePeriod=2 Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.276179 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.339024 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content\") pod \"17f855a0-3961-4b34-a175-c0a9c94bee58\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.339206 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdcv2\" (UniqueName: \"kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2\") pod \"17f855a0-3961-4b34-a175-c0a9c94bee58\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.339353 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities\") pod \"17f855a0-3961-4b34-a175-c0a9c94bee58\" (UID: \"17f855a0-3961-4b34-a175-c0a9c94bee58\") " Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.341402 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities" (OuterVolumeSpecName: "utilities") pod "17f855a0-3961-4b34-a175-c0a9c94bee58" (UID: "17f855a0-3961-4b34-a175-c0a9c94bee58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.353488 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2" (OuterVolumeSpecName: "kube-api-access-pdcv2") pod "17f855a0-3961-4b34-a175-c0a9c94bee58" (UID: "17f855a0-3961-4b34-a175-c0a9c94bee58"). InnerVolumeSpecName "kube-api-access-pdcv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.406812 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17f855a0-3961-4b34-a175-c0a9c94bee58" (UID: "17f855a0-3961-4b34-a175-c0a9c94bee58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.443673 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.444459 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17f855a0-3961-4b34-a175-c0a9c94bee58-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.444597 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdcv2\" (UniqueName: \"kubernetes.io/projected/17f855a0-3961-4b34-a175-c0a9c94bee58-kube-api-access-pdcv2\") on node \"crc\" DevicePath \"\"" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.468688 4990 generic.go:334] "Generic (PLEG): container finished" podID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerID="1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498" exitCode=0 Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.468744 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqx7m" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.468889 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerDied","Data":"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498"} Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.470744 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqx7m" event={"ID":"17f855a0-3961-4b34-a175-c0a9c94bee58","Type":"ContainerDied","Data":"eb60fabfb0f7f9a8685191907b76cf25360bb1f286987680da15f4403bcd56b4"} Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.470766 4990 scope.go:117] "RemoveContainer" containerID="1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.502344 4990 scope.go:117] "RemoveContainer" containerID="11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.515957 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.526241 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sqx7m"] Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.568371 4990 scope.go:117] "RemoveContainer" containerID="b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.605961 4990 scope.go:117] "RemoveContainer" containerID="1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498" Oct 03 12:38:59 crc kubenswrapper[4990]: E1003 12:38:59.606525 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498\": container with ID starting with 1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498 not found: ID does not exist" containerID="1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.606571 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498"} err="failed to get container status \"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498\": rpc error: code = NotFound desc = could not find container \"1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498\": container with ID starting with 1b4d70b09820be66ea9f1d0f2e9e758f39f2f92c5bc1bb1f4508ff5f813cc498 not found: ID does not exist" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.606598 4990 scope.go:117] "RemoveContainer" containerID="11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1" Oct 03 12:38:59 crc kubenswrapper[4990]: E1003 12:38:59.606887 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1\": container with ID starting with 11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1 not found: ID does not exist" containerID="11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.606929 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1"} err="failed to get container status \"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1\": rpc error: code = NotFound desc = could not find container \"11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1\": container with ID 
starting with 11122f70c7be56982b034f22666668f36cc1345fb0c2eb2001b88ab11442c4b1 not found: ID does not exist" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.606954 4990 scope.go:117] "RemoveContainer" containerID="b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1" Oct 03 12:38:59 crc kubenswrapper[4990]: E1003 12:38:59.607178 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1\": container with ID starting with b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1 not found: ID does not exist" containerID="b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1" Oct 03 12:38:59 crc kubenswrapper[4990]: I1003 12:38:59.607207 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1"} err="failed to get container status \"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1\": rpc error: code = NotFound desc = could not find container \"b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1\": container with ID starting with b8e47c04a56c1ddfd8ace8f40a7ae847d00ffc81fbaeda621a5adce539f9e5f1 not found: ID does not exist" Oct 03 12:39:00 crc kubenswrapper[4990]: I1003 12:39:00.883891 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" path="/var/lib/kubelet/pods/17f855a0-3961-4b34-a175-c0a9c94bee58/volumes" Oct 03 12:39:02 crc kubenswrapper[4990]: I1003 12:39:02.346162 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:02 crc kubenswrapper[4990]: I1003 12:39:02.346540 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:02 crc 
kubenswrapper[4990]: I1003 12:39:02.404897 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:02 crc kubenswrapper[4990]: I1003 12:39:02.565633 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:03 crc kubenswrapper[4990]: I1003 12:39:03.451052 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:39:04 crc kubenswrapper[4990]: I1003 12:39:04.547762 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bn26p" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="registry-server" containerID="cri-o://ee4f2e4a9bc59901ab77818ca38dce61d7b8ba70c552bd2fbba2fe3edaec81cc" gracePeriod=2 Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.579654 4990 generic.go:334] "Generic (PLEG): container finished" podID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerID="ee4f2e4a9bc59901ab77818ca38dce61d7b8ba70c552bd2fbba2fe3edaec81cc" exitCode=0 Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.579769 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerDied","Data":"ee4f2e4a9bc59901ab77818ca38dce61d7b8ba70c552bd2fbba2fe3edaec81cc"} Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.580195 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bn26p" event={"ID":"87ef93a0-d5a2-47a0-af09-5c210e589b3d","Type":"ContainerDied","Data":"a3e1b316458bfa73e0eca3c0f0af7c27c75f7893c931ab76c1a3923deaff130c"} Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.580217 4990 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a3e1b316458bfa73e0eca3c0f0af7c27c75f7893c931ab76c1a3923deaff130c" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.634346 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.701829 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content\") pod \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.701965 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities\") pod \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.702004 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlqv9\" (UniqueName: \"kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9\") pod \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\" (UID: \"87ef93a0-d5a2-47a0-af09-5c210e589b3d\") " Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.703916 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities" (OuterVolumeSpecName: "utilities") pod "87ef93a0-d5a2-47a0-af09-5c210e589b3d" (UID: "87ef93a0-d5a2-47a0-af09-5c210e589b3d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.709607 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9" (OuterVolumeSpecName: "kube-api-access-tlqv9") pod "87ef93a0-d5a2-47a0-af09-5c210e589b3d" (UID: "87ef93a0-d5a2-47a0-af09-5c210e589b3d"). InnerVolumeSpecName "kube-api-access-tlqv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.727883 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87ef93a0-d5a2-47a0-af09-5c210e589b3d" (UID: "87ef93a0-d5a2-47a0-af09-5c210e589b3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.806455 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.806496 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87ef93a0-d5a2-47a0-af09-5c210e589b3d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:39:05 crc kubenswrapper[4990]: I1003 12:39:05.806535 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlqv9\" (UniqueName: \"kubernetes.io/projected/87ef93a0-d5a2-47a0-af09-5c210e589b3d-kube-api-access-tlqv9\") on node \"crc\" DevicePath \"\"" Oct 03 12:39:06 crc kubenswrapper[4990]: I1003 12:39:06.591333 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bn26p" Oct 03 12:39:06 crc kubenswrapper[4990]: I1003 12:39:06.633888 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:39:06 crc kubenswrapper[4990]: I1003 12:39:06.646444 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bn26p"] Oct 03 12:39:06 crc kubenswrapper[4990]: I1003 12:39:06.883379 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" path="/var/lib/kubelet/pods/87ef93a0-d5a2-47a0-af09-5c210e589b3d/volumes" Oct 03 12:39:07 crc kubenswrapper[4990]: I1003 12:39:07.314749 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9d0ce5a3-e78d-4cfa-a405-596641465359/init-config-reloader/0.log" Oct 03 12:39:07 crc kubenswrapper[4990]: I1003 12:39:07.604705 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9d0ce5a3-e78d-4cfa-a405-596641465359/init-config-reloader/0.log" Oct 03 12:39:07 crc kubenswrapper[4990]: I1003 12:39:07.620771 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9d0ce5a3-e78d-4cfa-a405-596641465359/alertmanager/0.log" Oct 03 12:39:07 crc kubenswrapper[4990]: I1003 12:39:07.955004 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9d0ce5a3-e78d-4cfa-a405-596641465359/config-reloader/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.085937 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6cc089cc-aaba-4402-875d-c0abdd5a051e/aodh-api/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.252271 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6cc089cc-aaba-4402-875d-c0abdd5a051e/aodh-listener/0.log" Oct 03 
12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.319487 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6cc089cc-aaba-4402-875d-c0abdd5a051e/aodh-evaluator/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.425478 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6cc089cc-aaba-4402-875d-c0abdd5a051e/aodh-notifier/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.624032 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f7b959788-gvwmt_a570c5ff-a74e-4741-8baa-036a4dfd9423/barbican-api/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.848340 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f7b959788-gvwmt_a570c5ff-a74e-4741-8baa-036a4dfd9423/barbican-api-log/0.log" Oct 03 12:39:08 crc kubenswrapper[4990]: I1003 12:39:08.861669 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c959bd486-zgkx8_745cbe6e-a927-48db-af8e-f94d85b6f484/barbican-keystone-listener/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.121740 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c959bd486-zgkx8_745cbe6e-a927-48db-af8e-f94d85b6f484/barbican-keystone-listener-log/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.171182 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-75cfdf8777-z9jlb_5c935d0e-a506-4979-b44f-16d8ea969d14/barbican-worker/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.407855 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-75cfdf8777-z9jlb_5c935d0e-a506-4979-b44f-16d8ea969d14/barbican-worker-log/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.446941 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-9nksq_ddc5184e-8071-42ec-85d3-2bf5eba352ab/bootstrap-openstack-openstack-cell1/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.674706 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6029224-b0b3-4ec3-a6b9-0745dc24f55c/ceilometer-central-agent/0.log" Oct 03 12:39:09 crc kubenswrapper[4990]: I1003 12:39:09.853500 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6029224-b0b3-4ec3-a6b9-0745dc24f55c/sg-core/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.077590 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6029224-b0b3-4ec3-a6b9-0745dc24f55c/proxy-httpd/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.255065 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6029224-b0b3-4ec3-a6b9-0745dc24f55c/ceilometer-notification-agent/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.344500 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2fed11a-2d54-499e-b4ad-415a874de5dc/cinder-api-log/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.364197 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a2fed11a-2d54-499e-b4ad-415a874de5dc/cinder-api/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.572416 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_24ec0c7b-2701-4816-9a36-771572d653f0/cinder-scheduler/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.657022 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_24ec0c7b-2701-4816-9a36-771572d653f0/probe/0.log" Oct 03 12:39:10 crc kubenswrapper[4990]: I1003 12:39:10.757373 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-fst44_e826ef9d-06da-4c2c-a889-d7788e2ca694/configure-network-openstack-openstack-cell1/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.003762 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-pf4zt_0347a2f6-e939-4c8c-bc86-45622a71f3d4/configure-os-openstack-openstack-cell1/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.178250 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb556d4d7-dhjcp_7e7e4c39-9c12-4a92-8bd0-842048d14be5/init/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.339536 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb556d4d7-dhjcp_7e7e4c39-9c12-4a92-8bd0-842048d14be5/init/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.443038 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb556d4d7-dhjcp_7e7e4c39-9c12-4a92-8bd0-842048d14be5/dnsmasq-dns/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.582248 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-zvplg_29d6d8d9-57b8-4256-b831-940e5798ab5c/download-cache-openstack-openstack-cell1/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.706895 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac650573-e608-46d6-852b-d31c897b95e1/glance-httpd/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.806806 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ac650573-e608-46d6-852b-d31c897b95e1/glance-log/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.922755 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_d04849bc-a827-49d5-8ccc-901073a459cf/glance-httpd/0.log" Oct 03 12:39:11 crc kubenswrapper[4990]: I1003 12:39:11.987785 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d04849bc-a827-49d5-8ccc-901073a459cf/glance-log/0.log" Oct 03 12:39:12 crc kubenswrapper[4990]: I1003 12:39:12.595868 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-9b7c9fb-mfzpc_7e2cb064-93e7-4455-b395-ef9e266be90a/heat-api/0.log" Oct 03 12:39:12 crc kubenswrapper[4990]: I1003 12:39:12.794817 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6c667dbb67-dlps2_6f7c56db-6bb3-4a55-a41c-a8140008baf6/heat-engine/0.log" Oct 03 12:39:12 crc kubenswrapper[4990]: I1003 12:39:12.942360 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7db7b4869c-gx756_aaa8edfc-e5aa-4d3c-bee3-06249d9623ad/heat-cfnapi/0.log" Oct 03 12:39:13 crc kubenswrapper[4990]: I1003 12:39:13.092266 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-c86f77854-vcjzs_b22cd619-3c49-40f7-98c4-9d3e07566fab/horizon/0.log" Oct 03 12:39:13 crc kubenswrapper[4990]: I1003 12:39:13.843716 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-c86f77854-vcjzs_b22cd619-3c49-40f7-98c4-9d3e07566fab/horizon-log/0.log" Oct 03 12:39:13 crc kubenswrapper[4990]: I1003 12:39:13.939939 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-q4nck_abbd0668-e891-45f4-b884-8876d504f476/install-certs-openstack-openstack-cell1/0.log" Oct 03 12:39:14 crc kubenswrapper[4990]: I1003 12:39:14.131612 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-kppbz_204687ab-1f96-4f28-aa69-baaab8fcb8df/install-os-openstack-openstack-cell1/0.log" Oct 03 12:39:14 crc kubenswrapper[4990]: 
I1003 12:39:14.486541 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324881-h7424_b1f19dc4-6a33-415e-ba7d-90d5d8835f0e/keystone-cron/0.log" Oct 03 12:39:14 crc kubenswrapper[4990]: I1003 12:39:14.536112 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8459bb745d-tcmjm_1b2e7a2e-0ed1-4004-a487-24b50c0c956d/keystone-api/0.log" Oct 03 12:39:14 crc kubenswrapper[4990]: I1003 12:39:14.669256 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_75d40fc0-48e9-410e-b2f7-2b10e6e9a58b/kube-state-metrics/0.log" Oct 03 12:39:14 crc kubenswrapper[4990]: I1003 12:39:14.914259 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-zjd6k_724cbe0c-9b28-4fc2-acdc-b67f8518acfe/libvirt-openstack-openstack-cell1/0.log" Oct 03 12:39:15 crc kubenswrapper[4990]: I1003 12:39:15.910732 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-844c8c7569-rln6q_8e437b5a-e091-4f97-94d4-4a490365e5fa/neutron-api/0.log" Oct 03 12:39:16 crc kubenswrapper[4990]: I1003 12:39:16.070157 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-844c8c7569-rln6q_8e437b5a-e091-4f97-94d4-4a490365e5fa/neutron-httpd/0.log" Oct 03 12:39:16 crc kubenswrapper[4990]: I1003 12:39:16.669325 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-d77pm_11994fd9-0d27-4cb9-9871-40e9416961fd/neutron-dhcp-openstack-openstack-cell1/0.log" Oct 03 12:39:16 crc kubenswrapper[4990]: I1003 12:39:16.933257 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-dj8b7_dd955a7e-39fc-4115-94d4-f578031c03ca/neutron-metadata-openstack-openstack-cell1/0.log" Oct 03 12:39:17 crc kubenswrapper[4990]: I1003 12:39:17.196183 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-rj68z_e3834efd-692e-42cb-ab5f-3e3d8ebb3e7c/neutron-sriov-openstack-openstack-cell1/0.log" Oct 03 12:39:17 crc kubenswrapper[4990]: I1003 12:39:17.629941 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1d6174c4-8040-4f21-873f-a508d9e83ce3/nova-api-api/0.log" Oct 03 12:39:17 crc kubenswrapper[4990]: I1003 12:39:17.654916 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_1d6174c4-8040-4f21-873f-a508d9e83ce3/nova-api-log/0.log" Oct 03 12:39:17 crc kubenswrapper[4990]: I1003 12:39:17.914967 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fdcc49a5-24b8-4815-80b8-5d3edf557360/nova-cell0-conductor-conductor/0.log" Oct 03 12:39:18 crc kubenswrapper[4990]: I1003 12:39:18.189190 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a982af33-ad1d-409f-86f4-eb132a83dfa8/nova-cell1-conductor-conductor/0.log" Oct 03 12:39:18 crc kubenswrapper[4990]: I1003 12:39:18.526212 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6ee22400-6c67-47ea-8660-221b585bf898/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 12:39:18 crc kubenswrapper[4990]: I1003 12:39:18.975605 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellbhd8l_f78234e5-0ad8-42d0-9854-c5b70d570beb/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Oct 03 12:39:19 crc kubenswrapper[4990]: I1003 12:39:19.386404 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-5rbm6_e96962be-8334-4bd3-af96-db55ea084ef2/nova-cell1-openstack-openstack-cell1/0.log" Oct 03 12:39:19 crc kubenswrapper[4990]: I1003 12:39:19.690722 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_33fda7a9-0a65-4e23-b1c8-0f74f21b9515/nova-metadata-log/0.log" Oct 03 12:39:20 crc kubenswrapper[4990]: I1003 12:39:20.126040 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_33fda7a9-0a65-4e23-b1c8-0f74f21b9515/nova-metadata-metadata/0.log" Oct 03 12:39:20 crc kubenswrapper[4990]: I1003 12:39:20.233918 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_989baf10-df5f-4e5d-a3af-0475029519e5/nova-scheduler-scheduler/0.log" Oct 03 12:39:20 crc kubenswrapper[4990]: I1003 12:39:20.527792 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-786b7b89f7-lbh87_72d51b92-4d67-465b-8e49-41ba2069572f/init/0.log" Oct 03 12:39:20 crc kubenswrapper[4990]: I1003 12:39:20.642424 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-786b7b89f7-lbh87_72d51b92-4d67-465b-8e49-41ba2069572f/init/0.log" Oct 03 12:39:20 crc kubenswrapper[4990]: I1003 12:39:20.893248 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-786b7b89f7-lbh87_72d51b92-4d67-465b-8e49-41ba2069572f/octavia-api-provider-agent/0.log" Oct 03 12:39:21 crc kubenswrapper[4990]: I1003 12:39:21.025461 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-786b7b89f7-lbh87_72d51b92-4d67-465b-8e49-41ba2069572f/octavia-api/0.log" Oct 03 12:39:21 crc kubenswrapper[4990]: I1003 12:39:21.295401 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hdwv4_e4bb434c-f4bf-4f15-b354-e10c0c481296/init/0.log" Oct 03 12:39:21 crc kubenswrapper[4990]: I1003 12:39:21.396097 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hdwv4_e4bb434c-f4bf-4f15-b354-e10c0c481296/init/0.log" Oct 03 12:39:21 crc kubenswrapper[4990]: I1003 12:39:21.579705 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-hdwv4_e4bb434c-f4bf-4f15-b354-e10c0c481296/octavia-healthmanager/0.log" Oct 03 12:39:21 crc kubenswrapper[4990]: I1003 12:39:21.813110 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-wlmv4_925949a3-bcd4-45b8-ad30-1a86bec81360/init/0.log" Oct 03 12:39:22 crc kubenswrapper[4990]: I1003 12:39:22.004312 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-wlmv4_925949a3-bcd4-45b8-ad30-1a86bec81360/init/0.log" Oct 03 12:39:22 crc kubenswrapper[4990]: I1003 12:39:22.054703 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-wlmv4_925949a3-bcd4-45b8-ad30-1a86bec81360/octavia-housekeeping/0.log" Oct 03 12:39:22 crc kubenswrapper[4990]: I1003 12:39:22.274893 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-nzgf2_c3095d7b-e123-45ac-873c-6bf4b14b1f21/init/0.log" Oct 03 12:39:22 crc kubenswrapper[4990]: I1003 12:39:22.473650 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-nzgf2_c3095d7b-e123-45ac-873c-6bf4b14b1f21/init/0.log" Oct 03 12:39:22 crc kubenswrapper[4990]: I1003 12:39:22.525892 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-678599687f-nzgf2_c3095d7b-e123-45ac-873c-6bf4b14b1f21/octavia-amphora-httpd/0.log" Oct 03 12:39:23 crc kubenswrapper[4990]: I1003 12:39:23.285052 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-hp5sv_6d8de341-400b-42bb-b230-f7408b6e9075/init/0.log" Oct 03 12:39:23 crc kubenswrapper[4990]: I1003 12:39:23.455875 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-hp5sv_6d8de341-400b-42bb-b230-f7408b6e9075/init/0.log" Oct 03 12:39:23 crc kubenswrapper[4990]: I1003 12:39:23.554805 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-hp5sv_6d8de341-400b-42bb-b230-f7408b6e9075/octavia-rsyslog/0.log" Oct 03 12:39:23 crc kubenswrapper[4990]: I1003 12:39:23.765778 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-v2rc9_23a35a64-97d4-407e-8505-9eb966f5932f/init/0.log" Oct 03 12:39:23 crc kubenswrapper[4990]: I1003 12:39:23.956673 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-v2rc9_23a35a64-97d4-407e-8505-9eb966f5932f/init/0.log" Oct 03 12:39:24 crc kubenswrapper[4990]: I1003 12:39:24.108238 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-v2rc9_23a35a64-97d4-407e-8505-9eb966f5932f/octavia-worker/0.log" Oct 03 12:39:24 crc kubenswrapper[4990]: I1003 12:39:24.373741 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b68b675-4162-4159-9d49-ab3ee3c5a6e2/mysql-bootstrap/0.log" Oct 03 12:39:24 crc kubenswrapper[4990]: I1003 12:39:24.969751 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b68b675-4162-4159-9d49-ab3ee3c5a6e2/mysql-bootstrap/0.log" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.009101 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8b68b675-4162-4159-9d49-ab3ee3c5a6e2/galera/0.log" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.253460 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2f2afdb-bc52-41b3-bfb2-05cd189491ea/mysql-bootstrap/0.log" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.304100 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:39:25 crc 
kubenswrapper[4990]: I1003 12:39:25.304160 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.516036 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2f2afdb-bc52-41b3-bfb2-05cd189491ea/galera/0.log" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.524035 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_c2f2afdb-bc52-41b3-bfb2-05cd189491ea/mysql-bootstrap/0.log" Oct 03 12:39:25 crc kubenswrapper[4990]: I1003 12:39:25.734094 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4e1ed4fe-6135-4f8f-a1e0-b37f79804c2a/openstackclient/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.028695 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-964vn_3970bc67-319d-4676-b0f7-d0323d315f77/openstack-network-exporter/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.264905 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4brg4_dfc0151d-547f-4110-a1ea-707ba0e18796/ovsdb-server-init/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.526565 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4brg4_dfc0151d-547f-4110-a1ea-707ba0e18796/ovsdb-server-init/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.580790 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4brg4_dfc0151d-547f-4110-a1ea-707ba0e18796/ovs-vswitchd/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.738814 4990 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4brg4_dfc0151d-547f-4110-a1ea-707ba0e18796/ovsdb-server/0.log" Oct 03 12:39:26 crc kubenswrapper[4990]: I1003 12:39:26.926103 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vlqm5_4692a72f-d9d5-40ea-bd63-e8c939b254e9/ovn-controller/0.log" Oct 03 12:39:27 crc kubenswrapper[4990]: I1003 12:39:27.171636 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ebc9ec16-e9c6-4e49-895e-9ff733ff125d/openstack-network-exporter/0.log" Oct 03 12:39:27 crc kubenswrapper[4990]: I1003 12:39:27.367430 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ebc9ec16-e9c6-4e49-895e-9ff733ff125d/ovn-northd/0.log" Oct 03 12:39:27 crc kubenswrapper[4990]: I1003 12:39:27.821245 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-dhz6j_ac3694d2-235d-4f91-a2ac-eb6b627112b3/ovn-openstack-openstack-cell1/0.log" Oct 03 12:39:27 crc kubenswrapper[4990]: I1003 12:39:27.930153 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b02ea1a2-ca33-4f2c-8279-d7b210f06c02/openstack-network-exporter/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.031600 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b02ea1a2-ca33-4f2c-8279-d7b210f06c02/ovsdbserver-nb/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.210856 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_91033b00-fa38-456f-b530-86e9a3570cd4/openstack-network-exporter/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.419333 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_91033b00-fa38-456f-b530-86e9a3570cd4/ovsdbserver-nb/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.630039 4990 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_a6ba951f-b8cb-45d4-8c4d-2d43a008877b/openstack-network-exporter/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.761488 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_a6ba951f-b8cb-45d4-8c4d-2d43a008877b/ovsdbserver-nb/0.log" Oct 03 12:39:28 crc kubenswrapper[4990]: I1003 12:39:28.985686 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5bcb4406-33eb-4a5a-8e65-2ab0267ed97b/openstack-network-exporter/0.log" Oct 03 12:39:29 crc kubenswrapper[4990]: I1003 12:39:29.097764 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5bcb4406-33eb-4a5a-8e65-2ab0267ed97b/ovsdbserver-sb/0.log" Oct 03 12:39:29 crc kubenswrapper[4990]: I1003 12:39:29.354710 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6311fec4-d49f-4a01-a145-8e9afc3dded6/openstack-network-exporter/0.log" Oct 03 12:39:29 crc kubenswrapper[4990]: I1003 12:39:29.506408 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6311fec4-d49f-4a01-a145-8e9afc3dded6/ovsdbserver-sb/0.log" Oct 03 12:39:29 crc kubenswrapper[4990]: I1003 12:39:29.909552 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_82cfebd8-f54d-4ea6-8c90-8f125ab5f21d/openstack-network-exporter/0.log" Oct 03 12:39:30 crc kubenswrapper[4990]: I1003 12:39:30.183329 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_82cfebd8-f54d-4ea6-8c90-8f125ab5f21d/ovsdbserver-sb/0.log" Oct 03 12:39:30 crc kubenswrapper[4990]: I1003 12:39:30.458620 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575f56f894-g5gdh_fce8c270-053f-4b51-ab1d-6f92dabb2ed2/placement-api/0.log" Oct 03 12:39:30 crc kubenswrapper[4990]: I1003 12:39:30.497609 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-575f56f894-g5gdh_fce8c270-053f-4b51-ab1d-6f92dabb2ed2/placement-log/0.log" Oct 03 12:39:30 crc kubenswrapper[4990]: I1003 12:39:30.938534 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c9vhrn_b2ba5395-b429-4f21-9cdb-6845ab7b1716/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 03 12:39:31 crc kubenswrapper[4990]: I1003 12:39:31.180206 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_68ed3972-355d-4bd7-a2db-91814bf69cef/init-config-reloader/0.log" Oct 03 12:39:31 crc kubenswrapper[4990]: I1003 12:39:31.373397 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_68ed3972-355d-4bd7-a2db-91814bf69cef/config-reloader/0.log" Oct 03 12:39:31 crc kubenswrapper[4990]: I1003 12:39:31.627527 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_68ed3972-355d-4bd7-a2db-91814bf69cef/prometheus/0.log" Oct 03 12:39:31 crc kubenswrapper[4990]: I1003 12:39:31.644272 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_68ed3972-355d-4bd7-a2db-91814bf69cef/init-config-reloader/0.log" Oct 03 12:39:31 crc kubenswrapper[4990]: I1003 12:39:31.814866 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_68ed3972-355d-4bd7-a2db-91814bf69cef/thanos-sidecar/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.010111 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00f14153-c741-40c4-8cc9-7535f429d860/setup-container/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.088772 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8c316927-e4ca-496b-96a2-f31602456e6d/memcached/0.log" Oct 03 12:39:32 crc 
kubenswrapper[4990]: I1003 12:39:32.293668 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00f14153-c741-40c4-8cc9-7535f429d860/setup-container/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.307282 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00f14153-c741-40c4-8cc9-7535f429d860/rabbitmq/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.476738 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7cb7a78f-fac0-4afd-872a-edf928061dbd/setup-container/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.652624 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7cb7a78f-fac0-4afd-872a-edf928061dbd/setup-container/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.734620 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7cb7a78f-fac0-4afd-872a-edf928061dbd/rabbitmq/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.845954 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-kgw28_e9b036e9-0a70-4dbf-abc9-9a366d7e1126/reboot-os-openstack-openstack-cell1/0.log" Oct 03 12:39:32 crc kubenswrapper[4990]: I1003 12:39:32.964405 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-w5tgt_52183779-8954-4290-9dad-d7135f152399/run-os-openstack-openstack-cell1/0.log" Oct 03 12:39:33 crc kubenswrapper[4990]: I1003 12:39:33.970317 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-bczw5_8ebf76a3-140d-477c-81e3-e623baa47115/ssh-known-hosts-openstack/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.190438 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-686dbdfb85-nlj9c_9153262d-7256-4581-b8d7-bd3575ece8b1/proxy-httpd/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.227221 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-686dbdfb85-nlj9c_9153262d-7256-4581-b8d7-bd3575ece8b1/proxy-server/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.454365 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2dmzm_9964f731-6bb0-46e1-879c-2201ad1c280c/swift-ring-rebalance/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.609473 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-ssglj_1d028b25-53ad-418d-a47b-3bee28cdd317/telemetry-openstack-openstack-cell1/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.803357 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-tsllg_2df99b4a-9b37-4525-923f-469c5d607ce9/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 03 12:39:34 crc kubenswrapper[4990]: I1003 12:39:34.963582 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-gmcbb_2503d26b-59e5-4e31-816f-89eb184fcb78/validate-network-openstack-openstack-cell1/0.log" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.401278 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 12:39:52.402165 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402178 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 
12:39:52.402194 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="extract-content" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402200 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="extract-content" Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 12:39:52.402208 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="extract-content" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402214 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="extract-content" Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 12:39:52.402226 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="extract-utilities" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402232 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="extract-utilities" Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 12:39:52.402254 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402261 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: E1003 12:39:52.402273 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="extract-utilities" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402278 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="extract-utilities" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 
12:39:52.402468 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ef93a0-d5a2-47a0-af09-5c210e589b3d" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.402487 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f855a0-3961-4b34-a175-c0a9c94bee58" containerName="registry-server" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.403981 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.423943 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.508950 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.509026 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9zt\" (UniqueName: \"kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.509142 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc 
kubenswrapper[4990]: I1003 12:39:52.610986 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.611079 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9zt\" (UniqueName: \"kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.611210 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.611734 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.611960 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.630209 
4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9zt\" (UniqueName: \"kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt\") pod \"community-operators-4dn4r\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:52 crc kubenswrapper[4990]: I1003 12:39:52.723534 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:54.375680 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:55.090492 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerStarted","Data":"21bd2f99fa1dd7cc394553d58a6713f02ed4566520ee451077e61eb53a2ace94"} Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:55.304565 4990 patch_prober.go:28] interesting pod/machine-config-daemon-68v62 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:55.304615 4990 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:55.304662 4990 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-68v62" Oct 03 12:39:56 
crc kubenswrapper[4990]: I1003 12:39:55.305388 4990 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01"} pod="openshift-machine-config-operator/machine-config-daemon-68v62" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:55.305432 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerName="machine-config-daemon" containerID="cri-o://1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" gracePeriod=600 Oct 03 12:39:56 crc kubenswrapper[4990]: E1003 12:39:55.939243 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.104898 4990 generic.go:334] "Generic (PLEG): container finished" podID="f21ea38c-26da-4987-a50d-bafecdfbbd02" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" exitCode=0 Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.105063 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerDied","Data":"1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01"} Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.105103 4990 scope.go:117] "RemoveContainer" 
containerID="6fd879fdfac44d0b07716aadc30859aef53a6ca4beeeac86445d46e3e93bf327" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.105715 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:39:56 crc kubenswrapper[4990]: E1003 12:39:56.106009 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.112477 4990 generic.go:334] "Generic (PLEG): container finished" podID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerID="37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca" exitCode=0 Oct 03 12:39:56 crc kubenswrapper[4990]: I1003 12:39:56.112587 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerDied","Data":"37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca"} Oct 03 12:39:57 crc kubenswrapper[4990]: I1003 12:39:57.123639 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:39:57 crc kubenswrapper[4990]: E1003 12:39:57.124193 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" 
Oct 03 12:39:59 crc kubenswrapper[4990]: I1003 12:39:59.148533 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerStarted","Data":"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496"} Oct 03 12:40:05 crc kubenswrapper[4990]: I1003 12:40:05.219913 4990 generic.go:334] "Generic (PLEG): container finished" podID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerID="6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496" exitCode=0 Oct 03 12:40:05 crc kubenswrapper[4990]: I1003 12:40:05.220401 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerDied","Data":"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496"} Oct 03 12:40:08 crc kubenswrapper[4990]: I1003 12:40:08.249384 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerStarted","Data":"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd"} Oct 03 12:40:08 crc kubenswrapper[4990]: I1003 12:40:08.278696 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4dn4r" podStartSLOduration=5.799193571 podStartE2EDuration="16.278676132s" podCreationTimestamp="2025-10-03 12:39:52 +0000 UTC" firstStartedPulling="2025-10-03 12:39:56.114911499 +0000 UTC m=+10577.911543396" lastFinishedPulling="2025-10-03 12:40:06.5943941 +0000 UTC m=+10588.391025957" observedRunningTime="2025-10-03 12:40:08.275717046 +0000 UTC m=+10590.072348913" watchObservedRunningTime="2025-10-03 12:40:08.278676132 +0000 UTC m=+10590.075307989" Oct 03 12:40:10 crc kubenswrapper[4990]: I1003 12:40:10.872737 4990 scope.go:117] "RemoveContainer" 
containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:40:10 crc kubenswrapper[4990]: E1003 12:40:10.873875 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:40:12 crc kubenswrapper[4990]: I1003 12:40:12.724647 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:12 crc kubenswrapper[4990]: I1003 12:40:12.724981 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:13 crc kubenswrapper[4990]: I1003 12:40:13.779102 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4dn4r" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" probeResult="failure" output=< Oct 03 12:40:13 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:40:13 crc kubenswrapper[4990]: > Oct 03 12:40:23 crc kubenswrapper[4990]: I1003 12:40:23.777151 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4dn4r" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" probeResult="failure" output=< Oct 03 12:40:23 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:40:23 crc kubenswrapper[4990]: > Oct 03 12:40:23 crc kubenswrapper[4990]: I1003 12:40:23.872276 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:40:23 crc 
kubenswrapper[4990]: E1003 12:40:23.872494 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:40:32 crc kubenswrapper[4990]: I1003 12:40:32.803711 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:32 crc kubenswrapper[4990]: I1003 12:40:32.897044 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:33 crc kubenswrapper[4990]: I1003 12:40:33.050209 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:40:34 crc kubenswrapper[4990]: I1003 12:40:34.557286 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4dn4r" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" containerID="cri-o://b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd" gracePeriod=2 Oct 03 12:40:34 crc kubenswrapper[4990]: I1003 12:40:34.871860 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:40:34 crc kubenswrapper[4990]: E1003 12:40:34.872367 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.076797 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.192972 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content\") pod \"c928e03b-f3bb-4b3c-b02c-50b62887689a\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.193255 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9zt\" (UniqueName: \"kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt\") pod \"c928e03b-f3bb-4b3c-b02c-50b62887689a\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.193675 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities\") pod \"c928e03b-f3bb-4b3c-b02c-50b62887689a\" (UID: \"c928e03b-f3bb-4b3c-b02c-50b62887689a\") " Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.194341 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities" (OuterVolumeSpecName: "utilities") pod "c928e03b-f3bb-4b3c-b02c-50b62887689a" (UID: "c928e03b-f3bb-4b3c-b02c-50b62887689a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.200782 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt" (OuterVolumeSpecName: "kube-api-access-nw9zt") pod "c928e03b-f3bb-4b3c-b02c-50b62887689a" (UID: "c928e03b-f3bb-4b3c-b02c-50b62887689a"). InnerVolumeSpecName "kube-api-access-nw9zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.253563 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c928e03b-f3bb-4b3c-b02c-50b62887689a" (UID: "c928e03b-f3bb-4b3c-b02c-50b62887689a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.295798 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.296013 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9zt\" (UniqueName: \"kubernetes.io/projected/c928e03b-f3bb-4b3c-b02c-50b62887689a-kube-api-access-nw9zt\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.296066 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c928e03b-f3bb-4b3c-b02c-50b62887689a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.572357 4990 generic.go:334] "Generic (PLEG): container finished" podID="c928e03b-f3bb-4b3c-b02c-50b62887689a" 
containerID="b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd" exitCode=0 Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.572422 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerDied","Data":"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd"} Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.572454 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dn4r" event={"ID":"c928e03b-f3bb-4b3c-b02c-50b62887689a","Type":"ContainerDied","Data":"21bd2f99fa1dd7cc394553d58a6713f02ed4566520ee451077e61eb53a2ace94"} Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.572488 4990 scope.go:117] "RemoveContainer" containerID="b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.572697 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dn4r" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.615780 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.616303 4990 scope.go:117] "RemoveContainer" containerID="6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.626576 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4dn4r"] Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.637256 4990 scope.go:117] "RemoveContainer" containerID="37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.717558 4990 scope.go:117] "RemoveContainer" containerID="b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd" Oct 03 12:40:35 crc kubenswrapper[4990]: E1003 12:40:35.718175 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd\": container with ID starting with b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd not found: ID does not exist" containerID="b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.718257 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd"} err="failed to get container status \"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd\": rpc error: code = NotFound desc = could not find container \"b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd\": container with ID starting with b075c64558344b927f4377dd2c117f93793438a9bcb4315cffaaeabf03ba76fd not 
found: ID does not exist" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.718312 4990 scope.go:117] "RemoveContainer" containerID="6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496" Oct 03 12:40:35 crc kubenswrapper[4990]: E1003 12:40:35.719011 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496\": container with ID starting with 6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496 not found: ID does not exist" containerID="6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.719315 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496"} err="failed to get container status \"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496\": rpc error: code = NotFound desc = could not find container \"6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496\": container with ID starting with 6a2a07a9b58080347bd0a909461e10566e81c6793d274eb280bc4b7c900d7496 not found: ID does not exist" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.719341 4990 scope.go:117] "RemoveContainer" containerID="37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca" Oct 03 12:40:35 crc kubenswrapper[4990]: E1003 12:40:35.719715 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca\": container with ID starting with 37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca not found: ID does not exist" containerID="37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca" Oct 03 12:40:35 crc kubenswrapper[4990]: I1003 12:40:35.719748 4990 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca"} err="failed to get container status \"37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca\": rpc error: code = NotFound desc = could not find container \"37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca\": container with ID starting with 37d15611ac498e7dbc9ea0f44480a26bf13b22a78e4d70b174965e6e3ffad9ca not found: ID does not exist" Oct 03 12:40:36 crc kubenswrapper[4990]: I1003 12:40:36.891449 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" path="/var/lib/kubelet/pods/c928e03b-f3bb-4b3c-b02c-50b62887689a/volumes" Oct 03 12:40:44 crc kubenswrapper[4990]: I1003 12:40:44.670959 4990 generic.go:334] "Generic (PLEG): container finished" podID="29d82216-4aa6-4fea-822d-d9123b9586a9" containerID="b773e09589c50ba4c82c49f2e2e27e4210bab62a2f2bfcf2a272d00def146c1f" exitCode=0 Oct 03 12:40:44 crc kubenswrapper[4990]: I1003 12:40:44.671002 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" event={"ID":"29d82216-4aa6-4fea-822d-d9123b9586a9","Type":"ContainerDied","Data":"b773e09589c50ba4c82c49f2e2e27e4210bab62a2f2bfcf2a272d00def146c1f"} Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.824646 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.872366 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:40:45 crc kubenswrapper[4990]: E1003 12:40:45.872683 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.879587 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-hvtvd"] Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.888693 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-hvtvd"] Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.930594 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host\") pod \"29d82216-4aa6-4fea-822d-d9123b9586a9\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.930698 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host" (OuterVolumeSpecName: "host") pod "29d82216-4aa6-4fea-822d-d9123b9586a9" (UID: "29d82216-4aa6-4fea-822d-d9123b9586a9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.931131 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9p69\" (UniqueName: \"kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69\") pod \"29d82216-4aa6-4fea-822d-d9123b9586a9\" (UID: \"29d82216-4aa6-4fea-822d-d9123b9586a9\") " Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.933590 4990 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d82216-4aa6-4fea-822d-d9123b9586a9-host\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:45 crc kubenswrapper[4990]: I1003 12:40:45.943684 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69" (OuterVolumeSpecName: "kube-api-access-l9p69") pod "29d82216-4aa6-4fea-822d-d9123b9586a9" (UID: "29d82216-4aa6-4fea-822d-d9123b9586a9"). InnerVolumeSpecName "kube-api-access-l9p69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:40:46 crc kubenswrapper[4990]: I1003 12:40:46.041683 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9p69\" (UniqueName: \"kubernetes.io/projected/29d82216-4aa6-4fea-822d-d9123b9586a9-kube-api-access-l9p69\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:46 crc kubenswrapper[4990]: I1003 12:40:46.690961 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7f22dcb173b40f6106fba280412027299276cbdac3b22a05dc39bb205a02a1c" Oct 03 12:40:46 crc kubenswrapper[4990]: I1003 12:40:46.691071 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-hvtvd" Oct 03 12:40:46 crc kubenswrapper[4990]: I1003 12:40:46.883995 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d82216-4aa6-4fea-822d-d9123b9586a9" path="/var/lib/kubelet/pods/29d82216-4aa6-4fea-822d-d9123b9586a9/volumes" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.073601 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-pp6mv"] Oct 03 12:40:47 crc kubenswrapper[4990]: E1003 12:40:47.074105 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="extract-content" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074130 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="extract-content" Oct 03 12:40:47 crc kubenswrapper[4990]: E1003 12:40:47.074147 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d82216-4aa6-4fea-822d-d9123b9586a9" containerName="container-00" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074156 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d82216-4aa6-4fea-822d-d9123b9586a9" containerName="container-00" Oct 03 12:40:47 crc kubenswrapper[4990]: E1003 12:40:47.074170 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074178 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" Oct 03 12:40:47 crc kubenswrapper[4990]: E1003 12:40:47.074208 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="extract-utilities" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074215 4990 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="extract-utilities" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074447 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d82216-4aa6-4fea-822d-d9123b9586a9" containerName="container-00" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.074491 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="c928e03b-f3bb-4b3c-b02c-50b62887689a" containerName="registry-server" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.075394 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.171153 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgdg\" (UniqueName: \"kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.171296 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.272987 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scgdg\" (UniqueName: \"kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.273374 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.273709 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.294675 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgdg\" (UniqueName: \"kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg\") pod \"crc-debug-pp6mv\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.404078 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:47 crc kubenswrapper[4990]: I1003 12:40:47.705299 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" event={"ID":"bf4c456f-e01e-4ea5-882e-b95e0775968b","Type":"ContainerStarted","Data":"44979c11967ded656599bc8c25c52dd95a2687af10a21c4cfac7ebb484590825"} Oct 03 12:40:48 crc kubenswrapper[4990]: I1003 12:40:48.716743 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" event={"ID":"bf4c456f-e01e-4ea5-882e-b95e0775968b","Type":"ContainerStarted","Data":"9f4e889b963f87f590827d7729b10df91fd5ef951f6420db633eeed9252b0798"} Oct 03 12:40:48 crc kubenswrapper[4990]: I1003 12:40:48.742198 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" podStartSLOduration=1.742158256 podStartE2EDuration="1.742158256s" podCreationTimestamp="2025-10-03 12:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 12:40:48.734381034 +0000 UTC m=+10630.531012931" watchObservedRunningTime="2025-10-03 12:40:48.742158256 +0000 UTC m=+10630.538790203" Oct 03 12:40:49 crc kubenswrapper[4990]: I1003 12:40:49.730408 4990 generic.go:334] "Generic (PLEG): container finished" podID="bf4c456f-e01e-4ea5-882e-b95e0775968b" containerID="9f4e889b963f87f590827d7729b10df91fd5ef951f6420db633eeed9252b0798" exitCode=0 Oct 03 12:40:49 crc kubenswrapper[4990]: I1003 12:40:49.730450 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" event={"ID":"bf4c456f-e01e-4ea5-882e-b95e0775968b","Type":"ContainerDied","Data":"9f4e889b963f87f590827d7729b10df91fd5ef951f6420db633eeed9252b0798"} Oct 03 12:40:50 crc kubenswrapper[4990]: I1003 12:40:50.870161 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.046022 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host\") pod \"bf4c456f-e01e-4ea5-882e-b95e0775968b\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.046076 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scgdg\" (UniqueName: \"kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg\") pod \"bf4c456f-e01e-4ea5-882e-b95e0775968b\" (UID: \"bf4c456f-e01e-4ea5-882e-b95e0775968b\") " Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.046130 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host" (OuterVolumeSpecName: "host") pod "bf4c456f-e01e-4ea5-882e-b95e0775968b" (UID: "bf4c456f-e01e-4ea5-882e-b95e0775968b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.047453 4990 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf4c456f-e01e-4ea5-882e-b95e0775968b-host\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.057070 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg" (OuterVolumeSpecName: "kube-api-access-scgdg") pod "bf4c456f-e01e-4ea5-882e-b95e0775968b" (UID: "bf4c456f-e01e-4ea5-882e-b95e0775968b"). InnerVolumeSpecName "kube-api-access-scgdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.149054 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scgdg\" (UniqueName: \"kubernetes.io/projected/bf4c456f-e01e-4ea5-882e-b95e0775968b-kube-api-access-scgdg\") on node \"crc\" DevicePath \"\"" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.748453 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" event={"ID":"bf4c456f-e01e-4ea5-882e-b95e0775968b","Type":"ContainerDied","Data":"44979c11967ded656599bc8c25c52dd95a2687af10a21c4cfac7ebb484590825"} Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.748830 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44979c11967ded656599bc8c25c52dd95a2687af10a21c4cfac7ebb484590825" Oct 03 12:40:51 crc kubenswrapper[4990]: I1003 12:40:51.748530 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-pp6mv" Oct 03 12:40:57 crc kubenswrapper[4990]: I1003 12:40:57.871543 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:40:57 crc kubenswrapper[4990]: E1003 12:40:57.873649 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:41:00 crc kubenswrapper[4990]: I1003 12:41:00.530587 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-pp6mv"] Oct 03 12:41:00 crc kubenswrapper[4990]: I1003 12:41:00.548449 4990 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-pp6mv"] Oct 03 12:41:00 crc kubenswrapper[4990]: I1003 12:41:00.889089 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4c456f-e01e-4ea5-882e-b95e0775968b" path="/var/lib/kubelet/pods/bf4c456f-e01e-4ea5-882e-b95e0775968b/volumes" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.693759 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-7z99b"] Oct 03 12:41:01 crc kubenswrapper[4990]: E1003 12:41:01.695092 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4c456f-e01e-4ea5-882e-b95e0775968b" containerName="container-00" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.695171 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4c456f-e01e-4ea5-882e-b95e0775968b" containerName="container-00" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.695435 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4c456f-e01e-4ea5-882e-b95e0775968b" containerName="container-00" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.696256 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.752869 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.753148 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldd8d\" (UniqueName: \"kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.856220 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.856315 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldd8d\" (UniqueName: \"kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc kubenswrapper[4990]: I1003 12:41:01.856342 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:01 crc 
kubenswrapper[4990]: I1003 12:41:01.876254 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldd8d\" (UniqueName: \"kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d\") pod \"crc-debug-7z99b\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.021290 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.864649 4990 generic.go:334] "Generic (PLEG): container finished" podID="4ede6ea0-6a73-4963-87e6-b02474ccf9b6" containerID="208dbf40092542c0efb2312ba9f6648a7d7cbf051133d668f42b1bbcebce29a8" exitCode=0 Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.864804 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" event={"ID":"4ede6ea0-6a73-4963-87e6-b02474ccf9b6","Type":"ContainerDied","Data":"208dbf40092542c0efb2312ba9f6648a7d7cbf051133d668f42b1bbcebce29a8"} Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.865325 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" event={"ID":"4ede6ea0-6a73-4963-87e6-b02474ccf9b6","Type":"ContainerStarted","Data":"72273f6a9edccc55d2d38e1ca4d78ea42c40d5c07338ac6e67c6ce0da5e99e8f"} Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.923103 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-7z99b"] Oct 03 12:41:02 crc kubenswrapper[4990]: I1003 12:41:02.937964 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ggmzc/crc-debug-7z99b"] Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.222414 4990 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.328500 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host\") pod \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.328599 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldd8d\" (UniqueName: \"kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d\") pod \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\" (UID: \"4ede6ea0-6a73-4963-87e6-b02474ccf9b6\") " Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.328677 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host" (OuterVolumeSpecName: "host") pod "4ede6ea0-6a73-4963-87e6-b02474ccf9b6" (UID: "4ede6ea0-6a73-4963-87e6-b02474ccf9b6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.329144 4990 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-host\") on node \"crc\" DevicePath \"\"" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.368545 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d" (OuterVolumeSpecName: "kube-api-access-ldd8d") pod "4ede6ea0-6a73-4963-87e6-b02474ccf9b6" (UID: "4ede6ea0-6a73-4963-87e6-b02474ccf9b6"). InnerVolumeSpecName "kube-api-access-ldd8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.431059 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldd8d\" (UniqueName: \"kubernetes.io/projected/4ede6ea0-6a73-4963-87e6-b02474ccf9b6-kube-api-access-ldd8d\") on node \"crc\" DevicePath \"\"" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.887327 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/crc-debug-7z99b" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.887753 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ede6ea0-6a73-4963-87e6-b02474ccf9b6" path="/var/lib/kubelet/pods/4ede6ea0-6a73-4963-87e6-b02474ccf9b6/volumes" Oct 03 12:41:04 crc kubenswrapper[4990]: I1003 12:41:04.888754 4990 scope.go:117] "RemoveContainer" containerID="208dbf40092542c0efb2312ba9f6648a7d7cbf051133d668f42b1bbcebce29a8" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.267366 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/util/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.427248 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/pull/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.474413 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/util/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.475212 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/pull/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.638394 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/util/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.698789 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/extract/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.698838 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafehhpgl_51a737d8-6e16-425c-9dfc-bdcc5e8d3608/pull/0.log" Oct 03 12:41:05 crc kubenswrapper[4990]: I1003 12:41:05.782093 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-t2d2q_638beb05-4965-4934-90de-4b06ff173650/kube-rbac-proxy/0.log" Oct 03 12:41:06 crc kubenswrapper[4990]: I1003 12:41:06.630854 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-vhpqb_2e80486e-af20-49c4-9aba-fc7a312ed0b6/kube-rbac-proxy/0.log" Oct 03 12:41:06 crc kubenswrapper[4990]: I1003 12:41:06.639544 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-vhpqb_2e80486e-af20-49c4-9aba-fc7a312ed0b6/manager/0.log" Oct 03 12:41:06 crc kubenswrapper[4990]: I1003 12:41:06.662898 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-t2d2q_638beb05-4965-4934-90de-4b06ff173650/manager/0.log" Oct 03 12:41:06 crc 
kubenswrapper[4990]: I1003 12:41:06.783264 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-nls5l_7c0910c3-4982-4e25-9c19-f65aef6c1dc8/kube-rbac-proxy/0.log" Oct 03 12:41:06 crc kubenswrapper[4990]: I1003 12:41:06.837326 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-nls5l_7c0910c3-4982-4e25-9c19-f65aef6c1dc8/manager/0.log" Oct 03 12:41:06 crc kubenswrapper[4990]: I1003 12:41:06.930572 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-h88gm_466406f5-5f91-433d-859e-966713b4d752/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.096196 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-fcvl4_d863cc9f-06fe-4045-bb72-f2d3c36d14f3/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.173060 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-fcvl4_d863cc9f-06fe-4045-bb72-f2d3c36d14f3/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.173618 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-h88gm_466406f5-5f91-433d-859e-966713b4d752/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.296803 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-llmqm_a6826d26-170b-42c6-a519-599c8873c53a/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.351391 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-llmqm_a6826d26-170b-42c6-a519-599c8873c53a/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.419311 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-vhtvr_0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.573956 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-8vxtx_cac8d3c8-ac83-4af9-a285-4020d5978c74/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.637677 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-8vxtx_cac8d3c8-ac83-4af9-a285-4020d5978c74/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.749384 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-hwbjj_d97d66ac-8e4e-414b-bad7-1f4323ee7ac5/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.821465 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-vhtvr_0f7a4cab-e671-4dd4-a1b7-fdecc0814e4e/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.906499 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-hwbjj_d97d66ac-8e4e-414b-bad7-1f4323ee7ac5/manager/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.928268 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-nc6dd_28afe97f-598e-4e6f-95b5-3e72e3082504/kube-rbac-proxy/0.log" Oct 03 12:41:07 crc kubenswrapper[4990]: I1003 12:41:07.961266 
4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-nc6dd_28afe97f-598e-4e6f-95b5-3e72e3082504/manager/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.056560 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-m6x7p_8c8ec33a-ded3-49b2-9f80-41f52b2d2833/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.161448 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-d2sm6_aec15413-8ac4-4d63-a4d3-f28754800047/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.165801 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-m6x7p_8c8ec33a-ded3-49b2-9f80-41f52b2d2833/manager/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.299278 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-r7r9l_568468c9-631c-4cb7-bdcb-a6a6696c23f7/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.304789 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-d2sm6_aec15413-8ac4-4d63-a4d3-f28754800047/manager/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.500734 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-rhx7t_e76b0ac3-f0b9-4a88-b495-c6430a9568fe/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.592312 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-r7r9l_568468c9-631c-4cb7-bdcb-a6a6696c23f7/manager/0.log" Oct 03 12:41:08 crc 
kubenswrapper[4990]: I1003 12:41:08.629308 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-rhx7t_e76b0ac3-f0b9-4a88-b495-c6430a9568fe/manager/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.675805 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2_6683b324-a42c-419b-85a6-35386e0016bb/manager/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.693374 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7fb4f565cd94rr2_6683b324-a42c-419b-85a6-35386e0016bb/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.821169 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54df7874c5-glg9l_d9fab51c-9ec1-4099-9dd5-0177bb096874/kube-rbac-proxy/0.log" Oct 03 12:41:08 crc kubenswrapper[4990]: I1003 12:41:08.890674 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86f8d7b75f-zmfd7_d55415e3-b106-4fb0-8df5-c8a268332331/kube-rbac-proxy/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.238768 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86f8d7b75f-zmfd7_d55415e3-b106-4fb0-8df5-c8a268332331/operator/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.262369 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-qljt8_2143aa33-b534-4237-b15b-e63c30a4672d/kube-rbac-proxy/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.425619 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-rdszf_022e23c3-ee2b-46c0-a326-7cc338bd845e/registry-server/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.449090 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-8dtdl_66d8cf5e-b670-4023-8e90-de7accf71f47/kube-rbac-proxy/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.475793 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-qljt8_2143aa33-b534-4237-b15b-e63c30a4672d/manager/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.717712 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-8dtdl_66d8cf5e-b670-4023-8e90-de7accf71f47/manager/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.728126 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-frzpd_fc10a0dc-8a4d-4c45-abeb-376dc7344f48/operator/0.log" Oct 03 12:41:09 crc kubenswrapper[4990]: I1003 12:41:09.848434 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-2j7gl_5e6f315d-894b-4bcc-89fa-4cfa08e9cf88/kube-rbac-proxy/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.011954 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-cwfzj_8596d634-51fe-4c96-a347-4d99b57bb821/kube-rbac-proxy/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.049464 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-2j7gl_5e6f315d-894b-4bcc-89fa-4cfa08e9cf88/manager/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.210755 4990 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-6c9g6_669348d2-73cb-4efe-93b1-95f08fe54e82/kube-rbac-proxy/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.271940 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-6c9g6_669348d2-73cb-4efe-93b1-95f08fe54e82/manager/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.442636 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-77swg_b3964048-aaf9-4f7b-ba1b-4dba06b0f702/kube-rbac-proxy/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.636649 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-77swg_b3964048-aaf9-4f7b-ba1b-4dba06b0f702/manager/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.681947 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-cwfzj_8596d634-51fe-4c96-a347-4d99b57bb821/manager/0.log" Oct 03 12:41:10 crc kubenswrapper[4990]: I1003 12:41:10.872150 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:41:10 crc kubenswrapper[4990]: E1003 12:41:10.872397 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:41:12 crc kubenswrapper[4990]: I1003 12:41:12.529355 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54df7874c5-glg9l_d9fab51c-9ec1-4099-9dd5-0177bb096874/manager/0.log" Oct 03 12:41:24 crc kubenswrapper[4990]: I1003 12:41:24.876204 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:41:24 crc kubenswrapper[4990]: E1003 12:41:24.876883 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:41:28 crc kubenswrapper[4990]: I1003 12:41:28.372449 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r8p2v_a42cdb61-90ff-40c9-8b18-95b86f1dfc3a/control-plane-machine-set-operator/0.log" Oct 03 12:41:28 crc kubenswrapper[4990]: I1003 12:41:28.524196 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9k9dv_bc74c56c-b4ba-479b-87ba-ba707c62af66/kube-rbac-proxy/0.log" Oct 03 12:41:28 crc kubenswrapper[4990]: I1003 12:41:28.632057 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9k9dv_bc74c56c-b4ba-479b-87ba-ba707c62af66/machine-api-operator/0.log" Oct 03 12:41:38 crc kubenswrapper[4990]: I1003 12:41:38.889550 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:41:38 crc kubenswrapper[4990]: E1003 12:41:38.890250 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:41:42 crc kubenswrapper[4990]: I1003 12:41:42.605240 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-gp2bd_b6d0cdc1-a2f0-45a9-89d4-81903991fcd2/cert-manager-controller/0.log" Oct 03 12:41:42 crc kubenswrapper[4990]: I1003 12:41:42.632341 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-j9hm6_a9b8600d-5569-4df7-ac48-0925c3d9c0b9/cert-manager-cainjector/0.log" Oct 03 12:41:43 crc kubenswrapper[4990]: I1003 12:41:43.101715 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-6ddvk_e0a67b45-9f3c-4dac-ab06-c740fa33ee97/cert-manager-webhook/0.log" Oct 03 12:41:51 crc kubenswrapper[4990]: I1003 12:41:51.872740 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:41:51 crc kubenswrapper[4990]: E1003 12:41:51.874997 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:41:56 crc kubenswrapper[4990]: I1003 12:41:56.148350 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-fb92b_737d1d1e-6e9c-4d10-b3b9-101bd4b3e440/nmstate-console-plugin/0.log" Oct 03 12:41:56 crc kubenswrapper[4990]: I1003 12:41:56.935964 4990 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ll6ss_f9c86541-a05e-4cee-9a09-2466ca8d588e/nmstate-handler/0.log" Oct 03 12:41:56 crc kubenswrapper[4990]: I1003 12:41:56.977472 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pn5fp_15431b47-cf1b-425a-807a-7239e2947fff/nmstate-metrics/0.log" Oct 03 12:41:57 crc kubenswrapper[4990]: I1003 12:41:57.023167 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-pn5fp_15431b47-cf1b-425a-807a-7239e2947fff/kube-rbac-proxy/0.log" Oct 03 12:41:57 crc kubenswrapper[4990]: I1003 12:41:57.183629 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-4zf9r_b89c7007-cfc4-42f5-8b2d-024b7d364d54/nmstate-operator/0.log" Oct 03 12:41:57 crc kubenswrapper[4990]: I1003 12:41:57.200103 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-l94bb_277e4ee9-1a86-464c-b4b8-9dcb1a17e1df/nmstate-webhook/0.log" Oct 03 12:42:02 crc kubenswrapper[4990]: I1003 12:42:02.872492 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:42:02 crc kubenswrapper[4990]: E1003 12:42:02.873290 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:42:11 crc kubenswrapper[4990]: I1003 12:42:11.538054 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-pjbnl_d4754a5f-654a-40cd-999f-355a60ad887b/kube-rbac-proxy/0.log" Oct 03 12:42:11 crc kubenswrapper[4990]: I1003 12:42:11.724372 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-frr-files/0.log" Oct 03 12:42:11 crc kubenswrapper[4990]: I1003 12:42:11.956943 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-pjbnl_d4754a5f-654a-40cd-999f-355a60ad887b/controller/0.log" Oct 03 12:42:11 crc kubenswrapper[4990]: I1003 12:42:11.963704 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-frr-files/0.log" Oct 03 12:42:11 crc kubenswrapper[4990]: I1003 12:42:11.973676 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-metrics/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.008267 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-reloader/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.130113 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-reloader/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.284354 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-reloader/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.379600 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-frr-files/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.445473 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-metrics/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.531241 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-metrics/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.961504 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-frr-files/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.965159 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-metrics/0.log" Oct 03 12:42:12 crc kubenswrapper[4990]: I1003 12:42:12.969442 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/cp-reloader/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.065847 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/controller/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.208744 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/kube-rbac-proxy/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.226436 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/frr-metrics/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.324722 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/kube-rbac-proxy-frr/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.506615 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/reloader/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.570003 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-m6s44_c3e8a922-2054-46e4-b45e-4c525e08f72f/frr-k8s-webhook-server/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.794638 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67bf5f5c84-pddct_16ec1111-996b-4343-a1dd-8111dc1ee20f/manager/0.log" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.871908 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:42:13 crc kubenswrapper[4990]: E1003 12:42:13.872456 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:42:13 crc kubenswrapper[4990]: I1003 12:42:13.986178 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c6bfff67b-b9rrg_36b8b036-fee3-4797-b99b-81ebe6f5c659/webhook-server/0.log" Oct 03 12:42:14 crc kubenswrapper[4990]: I1003 12:42:14.108711 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jr4nb_031a0646-6ed0-467a-8042-5e0096fb631d/kube-rbac-proxy/0.log" Oct 03 12:42:15 crc kubenswrapper[4990]: I1003 12:42:15.140606 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jr4nb_031a0646-6ed0-467a-8042-5e0096fb631d/speaker/0.log" Oct 03 12:42:16 crc kubenswrapper[4990]: I1003 
12:42:16.668261 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qprvb_9bc795a9-e956-4de9-b9e8-98a20a7880af/frr/0.log" Oct 03 12:42:24 crc kubenswrapper[4990]: I1003 12:42:24.871941 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:42:24 crc kubenswrapper[4990]: E1003 12:42:24.874203 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:42:29 crc kubenswrapper[4990]: I1003 12:42:29.954085 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/util/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.098952 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/pull/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.132561 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/util/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.160457 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/pull/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.349397 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/util/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.350384 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/pull/0.log" Oct 03 12:42:30 crc kubenswrapper[4990]: I1003 12:42:30.381822 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69xqlm2_86384ae4-ecae-44dc-9ea7-80a4f19ed2c3/extract/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.174937 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/util/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.360127 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/util/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.385286 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/pull/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.393325 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/pull/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.568571 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/util/0.log" Oct 03 
12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.568627 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/pull/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.576038 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l2l9j_b86cb1d7-8858-480b-8f13-29f9d3d6a5e9/extract/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.719814 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/util/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.924498 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/pull/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.934187 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/pull/0.log" Oct 03 12:42:31 crc kubenswrapper[4990]: I1003 12:42:31.956194 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/util/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.170267 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/util/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.176750 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/extract/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.186650 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2djrfw9_d0891d41-d167-40c2-a9a0-5a44a84f8a71/pull/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.352210 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-utilities/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.496592 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-utilities/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.521664 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-content/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.554833 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-content/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.697064 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-utilities/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.729403 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/extract-content/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.795083 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-utilities/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.937627 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gn5hg_3336d35c-b95d-4fbf-90e4-de018ff26f36/registry-server/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.966492 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-utilities/0.log" Oct 03 12:42:32 crc kubenswrapper[4990]: I1003 12:42:32.993678 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-content/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.057446 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-content/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.239406 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-content/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.259185 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/extract-utilities/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.339640 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/util/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.513495 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/util/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.557410 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/pull/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.570035 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/pull/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.776158 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/util/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.776916 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/extract/0.log" Oct 03 12:42:33 crc kubenswrapper[4990]: I1003 12:42:33.808826 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cvhrwv_f30d9093-2fce-4e5d-a711-4c746b501b60/pull/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.000031 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jx4gz_3f649d78-ed82-4728-99ff-cf70e4d53756/marketplace-operator/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.005862 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-utilities/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: 
I1003 12:42:34.277802 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-content/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.302764 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-utilities/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.427941 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-content/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.533966 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-utilities/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.556622 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/extract-content/0.log" Oct 03 12:42:34 crc kubenswrapper[4990]: I1003 12:42:34.844104 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-utilities/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.078355 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8wnd4_d0d5f333-6e56-46b8-9d2d-031053c96543/registry-server/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.082476 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l2brd_317eadb4-b7e8-41c2-b7e8-a91b0d4719d1/registry-server/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.156833 4990 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-content/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.244307 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-utilities/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.300123 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-content/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.437915 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-content/0.log" Oct 03 12:42:35 crc kubenswrapper[4990]: I1003 12:42:35.490716 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/extract-utilities/0.log" Oct 03 12:42:36 crc kubenswrapper[4990]: I1003 12:42:36.632031 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6jsd_26f2805e-e269-4ee9-8e83-2513cea31add/registry-server/0.log" Oct 03 12:42:38 crc kubenswrapper[4990]: I1003 12:42:38.879666 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:42:38 crc kubenswrapper[4990]: E1003 12:42:38.880538 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:42:49 crc 
kubenswrapper[4990]: I1003 12:42:49.196640 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-kn4p6_14dc4e2b-0ed4-4e8c-8e4c-0a4c7162272b/prometheus-operator/0.log" Oct 03 12:42:49 crc kubenswrapper[4990]: I1003 12:42:49.356293 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c5cf4864b-5n4wn_1db65103-4b9c-4ef5-8d8c-f12e4c697871/prometheus-operator-admission-webhook/0.log" Oct 03 12:42:49 crc kubenswrapper[4990]: I1003 12:42:49.462181 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5c5cf4864b-g9nk7_03fd91e9-b108-4b44-8ded-67dc4bc47e98/prometheus-operator-admission-webhook/0.log" Oct 03 12:42:49 crc kubenswrapper[4990]: I1003 12:42:49.559986 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-62fwf_ecd65876-60c2-45ca-84e9-51a9acb8b6e8/operator/0.log" Oct 03 12:42:49 crc kubenswrapper[4990]: I1003 12:42:49.680228 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-68df8_b61f14d1-ad23-48c6-a6a3-bb45373e5128/perses-operator/0.log" Oct 03 12:42:51 crc kubenswrapper[4990]: I1003 12:42:51.871680 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:42:51 crc kubenswrapper[4990]: E1003 12:42:51.872649 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:02 crc kubenswrapper[4990]: 
I1003 12:43:02.872302 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:43:02 crc kubenswrapper[4990]: E1003 12:43:02.873138 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:13 crc kubenswrapper[4990]: I1003 12:43:13.871696 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:43:13 crc kubenswrapper[4990]: E1003 12:43:13.872497 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.310339 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:15 crc kubenswrapper[4990]: E1003 12:43:15.311083 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ede6ea0-6a73-4963-87e6-b02474ccf9b6" containerName="container-00" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.311096 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ede6ea0-6a73-4963-87e6-b02474ccf9b6" containerName="container-00" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.311336 4990 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ede6ea0-6a73-4963-87e6-b02474ccf9b6" containerName="container-00" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.312962 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.323486 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.487916 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdqh\" (UniqueName: \"kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.488094 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.488123 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.590132 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " 
pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.590291 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdqh\" (UniqueName: \"kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.590438 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.591074 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.591123 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.608192 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdqh\" (UniqueName: \"kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh\") pod \"redhat-operators-q428m\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " pod="openshift-marketplace/redhat-operators-q428m" Oct 
03 12:43:15 crc kubenswrapper[4990]: I1003 12:43:15.641091 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:16 crc kubenswrapper[4990]: I1003 12:43:16.125242 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:16 crc kubenswrapper[4990]: W1003 12:43:16.133560 4990 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffedccfc_38be_4c60_b4a1_d304fb87a8fa.slice/crio-a086e81df6be1a56b8a0e9ccdc073c4649552ad0fe7f90e56c246ea665c829d3 WatchSource:0}: Error finding container a086e81df6be1a56b8a0e9ccdc073c4649552ad0fe7f90e56c246ea665c829d3: Status 404 returned error can't find the container with id a086e81df6be1a56b8a0e9ccdc073c4649552ad0fe7f90e56c246ea665c829d3 Oct 03 12:43:16 crc kubenswrapper[4990]: I1003 12:43:16.416389 4990 generic.go:334] "Generic (PLEG): container finished" podID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerID="c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f" exitCode=0 Oct 03 12:43:16 crc kubenswrapper[4990]: I1003 12:43:16.416429 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerDied","Data":"c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f"} Oct 03 12:43:16 crc kubenswrapper[4990]: I1003 12:43:16.416455 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerStarted","Data":"a086e81df6be1a56b8a0e9ccdc073c4649552ad0fe7f90e56c246ea665c829d3"} Oct 03 12:43:16 crc kubenswrapper[4990]: I1003 12:43:16.418971 4990 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 12:43:17 crc kubenswrapper[4990]: 
I1003 12:43:17.430664 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerStarted","Data":"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68"} Oct 03 12:43:20 crc kubenswrapper[4990]: I1003 12:43:20.489911 4990 generic.go:334] "Generic (PLEG): container finished" podID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerID="0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68" exitCode=0 Oct 03 12:43:20 crc kubenswrapper[4990]: I1003 12:43:20.490012 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerDied","Data":"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68"} Oct 03 12:43:21 crc kubenswrapper[4990]: I1003 12:43:21.517600 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerStarted","Data":"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9"} Oct 03 12:43:21 crc kubenswrapper[4990]: I1003 12:43:21.547263 4990 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q428m" podStartSLOduration=1.905710501 podStartE2EDuration="6.547245117s" podCreationTimestamp="2025-10-03 12:43:15 +0000 UTC" firstStartedPulling="2025-10-03 12:43:16.41866468 +0000 UTC m=+10778.215296557" lastFinishedPulling="2025-10-03 12:43:21.060199326 +0000 UTC m=+10782.856831173" observedRunningTime="2025-10-03 12:43:21.539496727 +0000 UTC m=+10783.336128594" watchObservedRunningTime="2025-10-03 12:43:21.547245117 +0000 UTC m=+10783.343876984" Oct 03 12:43:25 crc kubenswrapper[4990]: I1003 12:43:25.641798 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 
12:43:25 crc kubenswrapper[4990]: I1003 12:43:25.643413 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:26 crc kubenswrapper[4990]: I1003 12:43:26.715010 4990 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q428m" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="registry-server" probeResult="failure" output=< Oct 03 12:43:26 crc kubenswrapper[4990]: timeout: failed to connect service ":50051" within 1s Oct 03 12:43:26 crc kubenswrapper[4990]: > Oct 03 12:43:26 crc kubenswrapper[4990]: I1003 12:43:26.872641 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:43:26 crc kubenswrapper[4990]: E1003 12:43:26.873115 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:35 crc kubenswrapper[4990]: I1003 12:43:35.733942 4990 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:35 crc kubenswrapper[4990]: I1003 12:43:35.802772 4990 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:35 crc kubenswrapper[4990]: I1003 12:43:35.985795 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:37 crc kubenswrapper[4990]: I1003 12:43:37.701048 4990 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-q428m" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="registry-server" containerID="cri-o://c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9" gracePeriod=2 Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.244058 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.360717 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content\") pod \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.361126 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities\") pod \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.361169 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdqh\" (UniqueName: \"kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh\") pod \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\" (UID: \"ffedccfc-38be-4c60-b4a1-d304fb87a8fa\") " Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.361926 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities" (OuterVolumeSpecName: "utilities") pod "ffedccfc-38be-4c60-b4a1-d304fb87a8fa" (UID: "ffedccfc-38be-4c60-b4a1-d304fb87a8fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.367700 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh" (OuterVolumeSpecName: "kube-api-access-mfdqh") pod "ffedccfc-38be-4c60-b4a1-d304fb87a8fa" (UID: "ffedccfc-38be-4c60-b4a1-d304fb87a8fa"). InnerVolumeSpecName "kube-api-access-mfdqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.442132 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffedccfc-38be-4c60-b4a1-d304fb87a8fa" (UID: "ffedccfc-38be-4c60-b4a1-d304fb87a8fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.464058 4990 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.464086 4990 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.464097 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdqh\" (UniqueName: \"kubernetes.io/projected/ffedccfc-38be-4c60-b4a1-d304fb87a8fa-kube-api-access-mfdqh\") on node \"crc\" DevicePath \"\"" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.712157 4990 generic.go:334] "Generic (PLEG): container finished" podID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" 
containerID="c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9" exitCode=0 Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.712224 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerDied","Data":"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9"} Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.713831 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q428m" event={"ID":"ffedccfc-38be-4c60-b4a1-d304fb87a8fa","Type":"ContainerDied","Data":"a086e81df6be1a56b8a0e9ccdc073c4649552ad0fe7f90e56c246ea665c829d3"} Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.712408 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q428m" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.713904 4990 scope.go:117] "RemoveContainer" containerID="c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.734635 4990 scope.go:117] "RemoveContainer" containerID="0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.758434 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.775266 4990 scope.go:117] "RemoveContainer" containerID="c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.776896 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q428m"] Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.819491 4990 scope.go:117] "RemoveContainer" containerID="c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9" Oct 03 12:43:38 crc 
kubenswrapper[4990]: E1003 12:43:38.819945 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9\": container with ID starting with c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9 not found: ID does not exist" containerID="c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.819993 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9"} err="failed to get container status \"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9\": rpc error: code = NotFound desc = could not find container \"c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9\": container with ID starting with c711176964846eb46f8f6b334d5c2af79449ba7c5985f91764e94f80a6b215b9 not found: ID does not exist" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.820025 4990 scope.go:117] "RemoveContainer" containerID="0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68" Oct 03 12:43:38 crc kubenswrapper[4990]: E1003 12:43:38.820364 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68\": container with ID starting with 0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68 not found: ID does not exist" containerID="0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.820406 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68"} err="failed to get container status 
\"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68\": rpc error: code = NotFound desc = could not find container \"0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68\": container with ID starting with 0e95f9797f70094971d9fd6d4d2b962c302e3159998a2a0ef984a9fbe5d92d68 not found: ID does not exist" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.820433 4990 scope.go:117] "RemoveContainer" containerID="c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f" Oct 03 12:43:38 crc kubenswrapper[4990]: E1003 12:43:38.820775 4990 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f\": container with ID starting with c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f not found: ID does not exist" containerID="c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.820801 4990 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f"} err="failed to get container status \"c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f\": rpc error: code = NotFound desc = could not find container \"c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f\": container with ID starting with c0c047bd00db5bec83cdbaecf4117d9a4e7a022e8eb9f060e5c66a53278af44f not found: ID does not exist" Oct 03 12:43:38 crc kubenswrapper[4990]: I1003 12:43:38.890090 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" path="/var/lib/kubelet/pods/ffedccfc-38be-4c60-b4a1-d304fb87a8fa/volumes" Oct 03 12:43:39 crc kubenswrapper[4990]: I1003 12:43:39.872819 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 
12:43:39 crc kubenswrapper[4990]: E1003 12:43:39.873496 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:51 crc kubenswrapper[4990]: I1003 12:43:51.871488 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:43:51 crc kubenswrapper[4990]: E1003 12:43:51.873237 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:43:56 crc kubenswrapper[4990]: I1003 12:43:56.369832 4990 scope.go:117] "RemoveContainer" containerID="b773e09589c50ba4c82c49f2e2e27e4210bab62a2f2bfcf2a272d00def146c1f" Oct 03 12:44:04 crc kubenswrapper[4990]: I1003 12:44:04.879021 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:44:04 crc kubenswrapper[4990]: E1003 12:44:04.879902 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" 
podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:44:15 crc kubenswrapper[4990]: I1003 12:44:15.872059 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:44:15 crc kubenswrapper[4990]: E1003 12:44:15.874325 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:44:27 crc kubenswrapper[4990]: I1003 12:44:27.872445 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:44:27 crc kubenswrapper[4990]: E1003 12:44:27.873276 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:44:42 crc kubenswrapper[4990]: I1003 12:44:42.872663 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:44:42 crc kubenswrapper[4990]: E1003 12:44:42.873700 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:44:54 crc kubenswrapper[4990]: I1003 12:44:54.872606 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01" Oct 03 12:44:54 crc kubenswrapper[4990]: E1003 12:44:54.873919 4990 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-68v62_openshift-machine-config-operator(f21ea38c-26da-4987-a50d-bafecdfbbd02)\"" pod="openshift-machine-config-operator/machine-config-daemon-68v62" podUID="f21ea38c-26da-4987-a50d-bafecdfbbd02" Oct 03 12:44:56 crc kubenswrapper[4990]: I1003 12:44:56.533787 4990 scope.go:117] "RemoveContainer" containerID="dbec29cd5511114c735b22e90c9e48d39447fa1bd75fe9d6701150badd0a1332" Oct 03 12:44:56 crc kubenswrapper[4990]: I1003 12:44:56.571365 4990 scope.go:117] "RemoveContainer" containerID="69ead2eee8be9c7d1c4c8514460b223b253ec758ee3b2459fa692a7736141406" Oct 03 12:44:56 crc kubenswrapper[4990]: I1003 12:44:56.613925 4990 scope.go:117] "RemoveContainer" containerID="ee4f2e4a9bc59901ab77818ca38dce61d7b8ba70c552bd2fbba2fe3edaec81cc" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.160538 4990 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4"] Oct 03 12:45:00 crc kubenswrapper[4990]: E1003 12:45:00.161590 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="registry-server" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.161605 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="registry-server" Oct 03 12:45:00 crc kubenswrapper[4990]: E1003 12:45:00.161630 4990 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="extract-utilities" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.161636 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="extract-utilities" Oct 03 12:45:00 crc kubenswrapper[4990]: E1003 12:45:00.161672 4990 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="extract-content" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.161678 4990 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="extract-content" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.161946 4990 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffedccfc-38be-4c60-b4a1-d304fb87a8fa" containerName="registry-server" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.162800 4990 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.167149 4990 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.167393 4990 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.174102 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4"] Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.262255 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfmsl\" (UniqueName: \"kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.262564 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.262739 4990 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.366389 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmsl\" (UniqueName: \"kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.366447 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.366661 4990 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.369966 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.390037 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.392951 4990 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfmsl\" (UniqueName: \"kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl\") pod \"collect-profiles-29324925-jqpj4\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:00 crc kubenswrapper[4990]: I1003 12:45:00.535096 4990 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:01 crc kubenswrapper[4990]: I1003 12:45:01.044651 4990 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4"] Oct 03 12:45:01 crc kubenswrapper[4990]: I1003 12:45:01.764094 4990 generic.go:334] "Generic (PLEG): container finished" podID="fa58b90d-e060-451e-8cc6-6a6d63d663dd" containerID="e0e762f2a2fea8a0af93b5820cbedb5a60db5a6ad410c0b01789c1a264304678" exitCode=0 Oct 03 12:45:01 crc kubenswrapper[4990]: I1003 12:45:01.764190 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" event={"ID":"fa58b90d-e060-451e-8cc6-6a6d63d663dd","Type":"ContainerDied","Data":"e0e762f2a2fea8a0af93b5820cbedb5a60db5a6ad410c0b01789c1a264304678"} Oct 03 12:45:01 crc kubenswrapper[4990]: I1003 12:45:01.764407 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" 
event={"ID":"fa58b90d-e060-451e-8cc6-6a6d63d663dd","Type":"ContainerStarted","Data":"431b0e389abef3f38079f8e2be13917293d0cce9ab9795aeaeb0640d69552c6e"} Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.153180 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.240651 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume\") pod \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.240775 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfmsl\" (UniqueName: \"kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl\") pod \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.240908 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume\") pod \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\" (UID: \"fa58b90d-e060-451e-8cc6-6a6d63d663dd\") " Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.241597 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa58b90d-e060-451e-8cc6-6a6d63d663dd" (UID: "fa58b90d-e060-451e-8cc6-6a6d63d663dd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.242116 4990 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa58b90d-e060-451e-8cc6-6a6d63d663dd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.245566 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa58b90d-e060-451e-8cc6-6a6d63d663dd" (UID: "fa58b90d-e060-451e-8cc6-6a6d63d663dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.247491 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl" (OuterVolumeSpecName: "kube-api-access-zfmsl") pod "fa58b90d-e060-451e-8cc6-6a6d63d663dd" (UID: "fa58b90d-e060-451e-8cc6-6a6d63d663dd"). InnerVolumeSpecName "kube-api-access-zfmsl". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.344345 4990 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa58b90d-e060-451e-8cc6-6a6d63d663dd-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.344393 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfmsl\" (UniqueName: \"kubernetes.io/projected/fa58b90d-e060-451e-8cc6-6a6d63d663dd-kube-api-access-zfmsl\") on node \"crc\" DevicePath \"\""
Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.784639 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4" event={"ID":"fa58b90d-e060-451e-8cc6-6a6d63d663dd","Type":"ContainerDied","Data":"431b0e389abef3f38079f8e2be13917293d0cce9ab9795aeaeb0640d69552c6e"}
Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.784710 4990 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431b0e389abef3f38079f8e2be13917293d0cce9ab9795aeaeb0640d69552c6e"
Oct 03 12:45:03 crc kubenswrapper[4990]: I1003 12:45:03.784681 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324925-jqpj4"
Oct 03 12:45:04 crc kubenswrapper[4990]: I1003 12:45:04.257703 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk"]
Oct 03 12:45:04 crc kubenswrapper[4990]: I1003 12:45:04.273260 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324880-dxrgk"]
Oct 03 12:45:04 crc kubenswrapper[4990]: I1003 12:45:04.919803 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8" path="/var/lib/kubelet/pods/b7005ea7-4eb8-4f8e-b5aa-43cb8d6304d8/volumes"
Oct 03 12:45:05 crc kubenswrapper[4990]: I1003 12:45:05.873870 4990 scope.go:117] "RemoveContainer" containerID="1a2fcdbb92d0b9faaed97663c5d3304dd7b870ff42937abbb2b7e3d60ec8ff01"
Oct 03 12:45:06 crc kubenswrapper[4990]: I1003 12:45:06.823107 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-68v62" event={"ID":"f21ea38c-26da-4987-a50d-bafecdfbbd02","Type":"ContainerStarted","Data":"9364e0dc1fb191d56ac9639c6d13fcb4c999c125498503c84694bd15f8333be8"}
Oct 03 12:45:37 crc kubenswrapper[4990]: I1003 12:45:37.176088 4990 generic.go:334] "Generic (PLEG): container finished" podID="15f06efd-0a47-438e-a547-50e70209f63d" containerID="84b68b65fcc3d4ccb615776a2091f6366064c9c62091e01537f1361127657121" exitCode=0
Oct 03 12:45:37 crc kubenswrapper[4990]: I1003 12:45:37.176192 4990 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ggmzc/must-gather-2bw77" event={"ID":"15f06efd-0a47-438e-a547-50e70209f63d","Type":"ContainerDied","Data":"84b68b65fcc3d4ccb615776a2091f6366064c9c62091e01537f1361127657121"}
Oct 03 12:45:37 crc kubenswrapper[4990]: I1003 12:45:37.178240 4990 scope.go:117] "RemoveContainer" containerID="84b68b65fcc3d4ccb615776a2091f6366064c9c62091e01537f1361127657121"
Oct 03 12:45:37 crc kubenswrapper[4990]: I1003 12:45:37.694152 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggmzc_must-gather-2bw77_15f06efd-0a47-438e-a547-50e70209f63d/gather/0.log"
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.050017 4990 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ggmzc/must-gather-2bw77"]
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.050848 4990 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ggmzc/must-gather-2bw77" podUID="15f06efd-0a47-438e-a547-50e70209f63d" containerName="copy" containerID="cri-o://f78ee03bbad3d3a6b6e4314b9a7b3467aa2ca8b401afcc131962a29788df18ea" gracePeriod=2
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.060256 4990 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ggmzc/must-gather-2bw77"]
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.285105 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggmzc_must-gather-2bw77_15f06efd-0a47-438e-a547-50e70209f63d/copy/0.log"
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.286320 4990 generic.go:334] "Generic (PLEG): container finished" podID="15f06efd-0a47-438e-a547-50e70209f63d" containerID="f78ee03bbad3d3a6b6e4314b9a7b3467aa2ca8b401afcc131962a29788df18ea" exitCode=143
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.545115 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggmzc_must-gather-2bw77_15f06efd-0a47-438e-a547-50e70209f63d/copy/0.log"
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.545595 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/must-gather-2bw77"
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.610235 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndv46\" (UniqueName: \"kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46\") pod \"15f06efd-0a47-438e-a547-50e70209f63d\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") "
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.610427 4990 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output\") pod \"15f06efd-0a47-438e-a547-50e70209f63d\" (UID: \"15f06efd-0a47-438e-a547-50e70209f63d\") "
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.619165 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46" (OuterVolumeSpecName: "kube-api-access-ndv46") pod "15f06efd-0a47-438e-a547-50e70209f63d" (UID: "15f06efd-0a47-438e-a547-50e70209f63d"). InnerVolumeSpecName "kube-api-access-ndv46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.714343 4990 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndv46\" (UniqueName: \"kubernetes.io/projected/15f06efd-0a47-438e-a547-50e70209f63d-kube-api-access-ndv46\") on node \"crc\" DevicePath \"\""
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.873694 4990 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "15f06efd-0a47-438e-a547-50e70209f63d" (UID: "15f06efd-0a47-438e-a547-50e70209f63d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 12:45:47 crc kubenswrapper[4990]: I1003 12:45:47.919179 4990 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15f06efd-0a47-438e-a547-50e70209f63d-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 03 12:45:48 crc kubenswrapper[4990]: I1003 12:45:48.296591 4990 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ggmzc_must-gather-2bw77_15f06efd-0a47-438e-a547-50e70209f63d/copy/0.log"
Oct 03 12:45:48 crc kubenswrapper[4990]: I1003 12:45:48.297983 4990 scope.go:117] "RemoveContainer" containerID="f78ee03bbad3d3a6b6e4314b9a7b3467aa2ca8b401afcc131962a29788df18ea"
Oct 03 12:45:48 crc kubenswrapper[4990]: I1003 12:45:48.298062 4990 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ggmzc/must-gather-2bw77"
Oct 03 12:45:48 crc kubenswrapper[4990]: I1003 12:45:48.332366 4990 scope.go:117] "RemoveContainer" containerID="84b68b65fcc3d4ccb615776a2091f6366064c9c62091e01537f1361127657121"
Oct 03 12:45:48 crc kubenswrapper[4990]: I1003 12:45:48.885340 4990 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f06efd-0a47-438e-a547-50e70209f63d" path="/var/lib/kubelet/pods/15f06efd-0a47-438e-a547-50e70209f63d/volumes"
Oct 03 12:45:56 crc kubenswrapper[4990]: I1003 12:45:56.691926 4990 scope.go:117] "RemoveContainer" containerID="690cb34e77af049c0c38feaaee5944f91cb3ea860435a60cf806c722970e4d2c"
Oct 03 12:46:56 crc kubenswrapper[4990]: I1003 12:46:56.784734 4990 scope.go:117] "RemoveContainer" containerID="9f4e889b963f87f590827d7729b10df91fd5ef951f6420db633eeed9252b0798"